A serious question: what important developments do you think came out of the Apollo project? I hear these kinds of claims quite often, but there is usually less meat to them on a closer look.

Someone above mentioned HTML as a spin-off of CERN. But should we really believe something similar would not have been invented elsewhere within a few years, had there been no CERN? Especially considering that the thousands of smart people who work on these projects wouldn't disappear, but would be working elsewhere, on other projects with potential spin-offs.

by GreatZamfir on Thu Feb 21st, 2008 at 03:02:53 PM EST
[ Parent ]
Just on your second paragraph: sure, I think somebody else would have come up with something similar some years later. But the difference in economic impact, had the WWW been invented five years later, would already be very big.

And on what projects do you think these thousands of smart people would have worked? Financial innovations? Micro-nukes?


The American is the orchid among humans
Volker Pispers

by Martin (weiser.mensch(at)googlemail.com) on Thu Feb 21st, 2008 at 03:09:35 PM EST
[ Parent ]
I think they'd probably be working on that fart problem Pierre mentioned. It sounds pretty serious to me.

We must hurry up and act, we have a world to rebuild
by dconrad (drconrad {arobase} gmail {point} com) on Thu Feb 21st, 2008 at 03:30:23 PM EST
[ Parent ]
You are not seriously claiming thousands of smart people wouldn't have had anything useful to do? What would have happened if LHC funding hadn't gone through? I am quite sure the people involved had other plans, and not just the people directly involved, but also all the people working for the companies that supply the LHC.

As for HTML, why not invert that idea? Perhaps without CERN it would have been invented earlier... The point is that there is so little relationship between CERN's activities and HTML that it seems too strong to claim that without CERN, the WWW would have taken five more years to arrive.

After all, the Web depends not just on HTML, but on a whole lot of interdependent technologies, both hardware and software, that were growing through the '80s.

by GreatZamfir on Fri Feb 22nd, 2008 at 05:16:56 AM EST
[ Parent ]
You underestimate the importance of HTML in creating the web.

Particle physics had progressed so fast since the 1940s that the particle physics community had developed a system of "preprints", in which people circulated drafts of their papers to colleagues at their own and other institutions months before they were published in journals. The story goes that Tim Berners-Lee got tired of e-mailing documents back and forth to colleagues at CERN and decided to invent HTML and code a bare-bones browser to allow him to (as we would say today) webcast his research. There is something about the pace of information exchange within CERN and the particle physics community that supports the idea that HTML might have taken five more years to be developed elsewhere (and it would have been at some university or other: USENET and the text-based tools that go with it, and Gopher, developed in that environment).

The large particle physics laboratories employ thousands of physicists, engineers and programmers specifically for the purposes of particle physics experiments, and that is a non-negligible fraction of the respective academic communities. If the large labs didn't exist, these people would be competing for academic jobs elsewhere, which would mean more people going into industry as well as fewer people getting doctorates.

If LHC funding hadn't gone through, CERN would have stagnated and maybe shrunk. You need far fewer people to run the existing facilities than you do to develop a new facility, and the LHC research programme is much more intense than what can be carried out at the existing facilities (not that that isn't useful too, but it's on a smaller scale in terms of people and resources).

Consider CERN and the LHC a Keynesian stimulus package for physics and engineering.

We have met the enemy, and he is us — Pogo

by Carrie (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 05:26:34 AM EST
[ Parent ]
The key thing about CERN was that the people who work there are spread across the planet a lot of the time: HTML - and more importantly HTTP - were designed to solve exactly the problem of sharing information with a geographically dispersed community, all of whom would be publishing data. It followed on from Gopher in some pretty obvious ways but was much less structured, which is its main beauty.
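
To make the "much less structured" point concrete, here is a minimal sketch in Python of how little machinery an early-web-style fetch needs: open a connection, send a one-line request, read back marked-up text. The host and path are illustrative stand-ins (example.com), nothing CERN-specific.

    # Minimal sketch: one TCP connection, a one-line request, and a stream
    # of marked-up text coming back. Host and path are illustrative stand-ins.
    import socket

    def fetch(host: str, path: str = "/", port: int = 80) -> str:
        """Issue a bare HTTP/1.0 GET and return the raw response as text."""
        with socket.create_connection((host, port)) as sock:
            request = f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n"
            sock.sendall(request.encode("ascii"))
            chunks = []
            while True:
                data = sock.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks).decode("latin-1", errors="replace")

    if __name__ == "__main__":
        print(fetch("example.com")[:500])  # headers plus the start of the HTML
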
by Colman (colman at eurotrib.com) on Fri Feb 22nd, 2008 at 05:33:14 AM EST
[ Parent ]
As an aside, it's only now, with people producing content all over the place, that the original vision for the web is being fulfilled - the phase of company brochure sites was painful to watch.
by Colman (colman at eurotrib.com) on Fri Feb 22nd, 2008 at 06:02:16 AM EST
[ Parent ]
And we're doing it by working around the shortcomings of the current publication model, as well.
by Colman (colman at eurotrib.com) on Fri Feb 22nd, 2008 at 06:02:55 AM EST
[ Parent ]
Thanks for these elucidations. To make it more general, could I say the idea is more or less "fundamental, difficult research is likely to encounter problems ahead of the rest of society, and is therefore relatively likely to find useful spin-off solutions"?

After all, it is easy to see in hindsight that CERN was the perfect place to develop a useful hypertext system. But if one wants to use the unexpected, unpredictable benefits of a project as one of the arguments for funding it, there has to be a rationale for why this particular project or field is especially likely to lead to unexpected benefits.

by GreatZamfir on Fri Feb 22nd, 2008 at 05:56:57 AM EST
[ Parent ]
In addition, "big science" projects tend to have engineering specs just outside what is possible when they are designed. The LHC (and LEP before it) required faster electronics than existed at the time they were designed, efficient cryogenics, superconducting magnets, and so on. In that way, CERN drives technology development just like, say, the specs for the next generation of high-speed trains or the Shinkansen do. The same is true of NASA's plans for the next generation of space telescopes (including gravitational wave detectors).

So, big science drives technological developments in established fields, as well as occasionally resulting in new technology. [I distinguish two basic modes of technological progress: secular improvements in technology and new technologies - only the latter qualifies as "innovation" IMHO, and that is not predictable in the way that one can use, say, Moore's law when designing the specs of a computer system to be deployed 5 years in the future.]
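
As a back-of-the-envelope illustration of that "predictable" mode (a sketch in Python, assuming the commonly quoted doubling time of roughly two years, which is my assumption here rather than anything claimed in this thread):

    # Back-of-the-envelope Moore's-law extrapolation: how much denser can we
    # expect chips to be by deployment time? The doubling period is an assumption.
    DOUBLING_PERIOD_YEARS = 2.0

    def expected_scaling(years_ahead: float,
                         doubling_period: float = DOUBLING_PERIOD_YEARS) -> float:
        """Expected multiplicative improvement after `years_ahead` years."""
        return 2.0 ** (years_ahead / doubling_period)

    print(f"{expected_scaling(5):.1f}x")  # roughly 5.7x after five years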

We have met the enemy, and he is us — Pogo

by Carrie (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 06:03:38 AM EST
[ Parent ]
A bit off-topic, but the improvement/innovation distinction is another view I am rather sceptical about. If you zoom in on the "improvements", you usually see the same picture again: some of the improvements are seen as radical changes within the field itself, while others still look like gradual improvements. Zoom in on the gradual improvements and you get the same picture once more: what looks like gradual improvement from the outside is unexpected innovation close up.

I would argue it's innovation all the way through. Some improvements change a subfield, and from the outside it looks like gradual, expected improvement. Some change a field, and the outside world can notice it and say it's something fundamentally different.

by GreatZamfir on Fri Feb 22nd, 2008 at 07:07:32 AM EST
[ Parent ]
Well, actually, from the point of view of input-output (I/O) models of the economy there is a distinction between whether an advance just changes the productivity/cost coefficients of the model, or changes its dimensionality by adding a new process or a new product.

The difference between the dynamical systems we are used to considering in physics and biological or economic evolution is the possibility of the system of differential/difference equations changing dimensionality in response to processes within the system itself.
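
A toy sketch of that distinction, under standard Leontief input-output assumptions and with invented numbers: an "improvement" changes a coefficient of the technology matrix, while an "innovation" adds a row and a column, changing the model's dimensionality.

    # Toy Leontief input-output sketch (all numbers made up for illustration).
    # "Improvement" tweaks a coefficient of the technology matrix A;
    # "innovation" adds a whole new row and column, i.e. a new dimension.
    import numpy as np

    def gross_output(A: np.ndarray, demand: np.ndarray) -> np.ndarray:
        """Solve x = A x + d, i.e. x = (I - A)^{-1} d."""
        n = A.shape[0]
        return np.linalg.solve(np.eye(n) - A, demand)

    # Two-sector economy: A[i, j] is the input of good i needed per unit of good j.
    A = np.array([[0.2, 0.3],
                  [0.1, 0.4]])
    d = np.array([10.0, 5.0])
    print(gross_output(A, d))

    # "Secular improvement": sector 1 gets more efficient, a coefficient shrinks.
    A_improved = A.copy()
    A_improved[0, 1] = 0.2
    print(gross_output(A_improved, d))

    # "Innovation": a third process/product appears; A gains a row and a column
    # and the demand vector gains an entry. The model's dimension has changed.
    A_new = np.array([[0.2, 0.3, 0.1],
                      [0.1, 0.4, 0.2],
                      [0.05, 0.1, 0.1]])
    d_new = np.array([10.0, 5.0, 2.0])
    print(gross_output(A_new, d_new))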

We have met the enemy, and he is us — Pogo

by Carrie (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 07:30:00 AM EST
[ Parent ]
I would consider this more an artifact of the modelling than a fundamental point about reality. After all, how do you determine when a new product adds a dimension and when it merely changes existing coefficients? As long as a product is a perfect replacement for some existing product, only better along an existing axis, that's easy.

But in reality, new products and inventions, even improvements on existing ones, are usually not that simple. They add an extra dimension, more freedom to find better solutions to problems. In a high-level, low-dimensional description, this freedom can either be collapsed into a change in parameters or really be added as an extra dimension, if the effects are important enough.

Funny thing is, I am currently working on shape optimization, where it is completely natural to change the number of parameters used to describe the shape, and thus the dimension of the problem.

A related field is order reduction, where you try to (locally) approximate a physical phenomenon by its most important modes. If there is a change in the physics, you can either modify the modes while keeping the same number of them, or you might find that the new situation requires more modes to describe it well enough.
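
A minimal sketch of that idea, assuming a POD/SVD-style reduction and an entirely made-up "physics": truncate to the leading modes, check how much of the behaviour the truncated basis captures, and note that richer behaviour forces you to keep more modes.

    # Proper-orthogonal-decomposition (POD) style order reduction: collect
    # snapshots of a field, keep only the leading SVD modes, and see how much
    # of the snapshot "energy" the truncated basis captures. The "physics"
    # here is a fabricated sum of spatial modes, purely for illustration.
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 200)

    def snapshots(n_modes: int, n_samples: int = 50) -> np.ndarray:
        """Fake snapshot matrix: each column is one realisation of the field."""
        basis = np.array([np.sin((k + 1) * np.pi * x) for k in range(n_modes)])
        weights = rng.normal(size=(n_modes, n_samples)) / (1 + np.arange(n_modes))[:, None]
        return basis.T @ weights  # shape (len(x), n_samples)

    def energy_captured(S: np.ndarray, r: int) -> float:
        """Fraction of snapshot energy retained by the leading r POD modes."""
        sigma = np.linalg.svd(S, compute_uv=False)
        return float(np.sum(sigma[:r] ** 2) / np.sum(sigma ** 2))

    simple = snapshots(n_modes=3)    # "old physics": few active modes
    richer = snapshots(n_modes=12)   # "new physics": more modes needed
    for r in (3, 6, 12):
        print(r, round(energy_captured(simple, r), 3), round(energy_captured(richer, r), 3))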

I would suggest this is a good analogy for your innovation/improvement distinction.

by GreatZamfir on Fri Feb 22nd, 2008 at 08:07:51 AM EST
[ Parent ]
Well, a new dimension corresponds to a new manufacturing process, with different inputs. As long as there is substitutability you don't have "true" innovation.

I am familiar with dimension reduction (proper orthogonal modes, principal components, factor analysis...) and you're right, at some level the number of variables is a matter of choice. But you still have to be able to close the system of equations. You can always ascribe the effect of all the neglected modes to "noise", though.

We have met the enemy, and he is us — Pogo

by Carrie (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 03:10:26 PM EST
[ Parent ]
Well, you could say that they would create something similar if they were working on something else, and that might be true, but without the funding and support of such a project they wouldn't have the freedom, or the livelihood, to develop these things. There's also a great deal of cross-collaboration in these things: if they aren't working in science, or are working on smaller projects, the chances of coming up with something spectacular are almost certainly lower.

One of the main things coming out of Apollo and the like, IIRC, was the development of computer chips for the project. Large advances in microchips and materials science filtered out to the outside world. Whilst industry might have got there as well, I'd say it would almost certainly have got there more slowly, due to the very nature of business - a business looking at short-term profit is far less likely to allow its researchers the time and space to create a bigger, longer-lived project with its associated spin-offs.

Early research is expensive mainly because you don't know what the right solution is. It could be any one of a number of different options, and until you pick one you don't know, so there has to be a lot of investment without too much immediate pressure for results on every route. A lot of them will be blind alleys, but without checking you'll never know which ones are right.

by darrkespur on Thu Feb 21st, 2008 at 03:17:08 PM EST
[ Parent ]
But then the million-dollar question is, why did the US spend the money on Apollo rather than directly on chip research? Especially as guided missiles needing chips were not exactly unimportant outside of the Apollo/manned spaceflight program.
by GreatZamfir on Fri Feb 22nd, 2008 at 05:01:44 AM EST
[ Parent ]
Three related reasons: Sputnik angst, the race with the Soviet Union in every field, and aerospace technology development for military purposes.

But the fact is, the Apollo program was a one-shot thing. It was wound down and the US lost its ability to fly to the moon. It also discontinued its high-payload rockets in favour of the Space Shuttle, so now the rocket market is occupied by the European Ariane and the Russian Proton.

The Soviet manned space program made more scientific and technical sense than the American one, and the ISS owes more to the Russian Soyuz and Mir than to the American Skylab, which was also discontinued and folded into the Shuttle.

We have met the enemy, and he is us — Pogo

by Carrie (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 05:12:22 AM EST
[ Parent ]
Yeah, I know. I am a final-year aerospace engineering student, so I have heard my fair share of space histories...

One side story that I found particularly intriguing was a note between, I think, McNamara and Lyndon Johnson in the early '60s. In it they discuss the budget surpluses they are expecting for the late '60s, and they fear Congress will call for tax reductions before they can use the surpluses for their Great Society plans. So in the meantime they consider Apollo a good and popular way to keep the budget balanced until they have better things to do with the money. Then came Vietnam...

But more on topic, the whole "spin-off" concept seems to have been pretty much invented by NASA in later years to justify its budgets, and it gets used for many "big science & engineering" projects whenever the wider public has doubts about the costs.

by GreatZamfir on Fri Feb 22nd, 2008 at 05:38:00 AM EST
[ Parent ]
(1) Spend money on chip research for what? What would the chips be used for? The great proliferation of electronics came on the back of very advanced requirements for components for space programs and the like. One could argue that only once they had been developed for such purposes was it possible to consider their use for more mundane matters. The personal computer only became possible with a maturation of integrated-circuit technology, computation infrastructure, and computational techniques that allowed for cheap mass manufacture. The drivers of this technology were expensive research programmes in fields requiring the processing of large data sets, such as, say, high energy physics research. Forget about the direct spin-offs; I would argue that the influences of these expensive programmes are far more subtle. Technological diffusion requires that the basic building blocks are already lying about, looking for a different application. You don't start developing processor technology because you think that in 20 years you'll be able to make a machine to play games on.

(2) Missile programmes, you say? Because military programmes with large destructive potential are soooo useful, while high energy physics, space exploration and the like are vanity projects! And you know how the military loves to share the technology it develops and would never want to keep it secret. One of the great advantages of the large-lab high energy physics environment is exactly that it is not a military programme. We don't build things to kill people; I think this is a plus. Further, there is no tendency to keep progress secret. Quite the opposite, in fact, and thus there is a greater chance that progress made here can diffuse faster and wider.

(Disclosure, I work at CERN.)

by someone (s0me1smail(a)gmail(d)com) on Fri Feb 22nd, 2008 at 06:06:35 AM EST
[ Parent ]
In other words (and this ties in with my comments on the HTML subthread), technological progress is largely demand-driven. If you want progress you have to create demand for advanced technology. You can choose the form your Keynesian stimulus will take: will it be big science or big guns? Other public spending is in the same category, too. Do you want to drive the development of medical treatments? Improvements in construction techniques and materials? Improvements in transportation technology? Energy technology? The way to do this is to publicly fund projects which push the boundaries of what's possible. The private sector could do this too, but it can't afford the solvency risk of sinking money into failed research. The public sector can. It's just a matter of priorities.

We have met the enemy, and he is us — Pogo
by Carrie (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 06:42:09 AM EST
[ Parent ]
I think Apollo was a product, not a cause. After Sputnik there was a massive push towards science and engineering in the US, and Apollo fell out of that. So did most of the computer industry.

There's very little evidence to suggest that Apollo contributed directly to electronics design. The first patent for single-chip microcircuitry was granted in 1958. Computers with modular logic were built around the same time. Putting the two together was an obvious next step, and would have happened anyway.

Apollo was mostly a PR exercise for US science and engineering. There may have been some spin-offs in materials science and - obviously - rocket science. But Apollo hasn't left much of a trace in computer science history.

In fact it's rarely mentioned at all. Projects like SAGE, the first-generation US air defence network, were much more important. NASA did buy a big IBM System/360 for Apollo, but the System/360 was already around, and IBM were more interested in selling it as a tool for airline bookings than for managing a space program.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Fri Feb 22nd, 2008 at 08:35:45 AM EST
[ Parent ]
One bit of hearsay lore that I picked up somewhere (probably on TV) is that the physical space constraints inherent in spacecraft design prompted Apollo scientists and related engineers in various industries to work on making things like transistors work in a practical setting, as the existing vacuum-tube technologies were simply too big.
by Zwackus on Tue Feb 26th, 2008 at 12:51:39 AM EST
[ Parent ]
I don't think it's about volume; weight is more likely, and I think it was mainly the Minuteman program that really required them. But I would suggest this was only a slight influence. People tried to build integrated circuits all through the '50s, and the first successful ones appeared somewhere around 1960. So there might have been a few years, between their development and their first commercial use in the mid-60s, when rocket programs were the main users.

Keep in mind that the big advantage of ICs, even in those years, was the possibility of getting prices down through mass production. Not really something the space program, or even Minuteman, was very concerned about.

by GreatZamfir on Tue Feb 26th, 2008 at 03:57:56 AM EST
[ Parent ]
GreatZamfir:
Someone above mentioned HTML as a spin-off of CERN. But should we really believe something similar would not have been invented elsewhere within a few years, had there been no CERN?

That's a really hard question to answer. HTML didn't happen directly because of CERN, but it happened because CERN was an environment in which a quick mark-up system would be instantly useful, and because there was no need for 'research' to invent anything more complicated.

There were many, many alternatives to HTML, including horrible things from academia that are best forgotten.

I know people who were researching them, and while they were often better than HTML in many ways - e.g. no broken links - they were also wretchedly overcomplicated, with limited public appeal.

So HTML might well have never happened in its current form. We could easily have had some kind of Windows-ish or other system of gargantuan complexity and slowness.

If you look at academic vs 'public' computing there's a clear pattern of highly abstracted command-line tools in academia (e.g. LaTeX), and much simpler WYSIWYG colouring-book computing in the public arena.

HTML broke that pattern by doing something script-ish but relatively simple inside academia, which subsequently escaped into the wild.

That hasn't really happened before, which I think means it's not something that could be relied on.

Or in other words - it's likely CERN got lucky.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Thu Feb 21st, 2008 at 06:01:00 PM EST
[ Parent ]
The semiconductor industry would likely be a few decades behind its current state if there had not been military and space applications for the transistor when it was invented. The reason is the gap between military and commercial viability, defined mostly by cost, and arguably by transistor size (and thus integrated circuit complexity) as well. That gap was filled with public money in the form of the US military budget. The industry grew, and at some point commercial viability started to grow out of that, and henceforth the industry could be sustained as such.

Had that scenario not occurred, the industry would not have existed until advances in other sciences and industries had created commercial viability for it indirectly.

you are the media you consume.

by MillMan (millguy at gmail) on Thu Feb 21st, 2008 at 06:05:40 PM EST
[ Parent ]
I'm not sure it's as clean as that.

Mainframe computing was established by the late 50s, and mini-computing was just starting up. The market was already worth $billions by then. There were some prestige military projects - e.g. SAGE again - and a lot of DARPA funding for research. But the civilian market was already huge, with its own momentum.

Once TI introduced TTL logic in the early 60s, computers became a lot cheaper. At the same time a strong hobbyist culture fanned by magazines kept interest in technology running very high, so there was a steady stream of wannabe engineers with experience of digital techniques from their early teens.

Microprocessors were already being planned in the mid-60s. The biggest gap was between commercial computing and the microprocessor market, and that was bridged by developing a general-purpose microprocessor and putting it into a commercial product - a desktop calculator. It wasn't a military project.

Now you had a hobbyist/hacker culture with access to microprocessors and a background of DARPA-funded interface and networking research.

The rest was probably inevitable.

What's astonishing is how fast it happened. Most of the core ideas - laptops, databases, the web, GUIs and interactivity, distributed processing, networking, 3D graphics - appeared between 1958 and 1968.

There's been very little genuinely new since then. Most of what's happened has been faster and cheaper, but not so truly innovative.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Fri Feb 22nd, 2008 at 09:15:00 AM EST
[ Parent ]
It would be interesting to compare commercial to military/government revenues over time. I should study the topic further, because it sits at the intersection of several topics I'm interested in.

The early commercial viability of mainframes is a good point that I managed to forget. I'll still make my vague 20-year claim, though.

I agree that it all happened shockingly fast.

Most of what's happened has been faster and cheaper, but not so truly innovative.

I disagree. I've been reading IEEE magazines since I became an EE major in college, and there has been some stunning work over the years in the semiconductor physics realm, work that was required to get to the commercially viable transistor sizes we have today. From the computing point of view, though, I agree with what you're saying.

you are the media you consume.

by MillMan (millguy at gmail) on Fri Feb 22nd, 2008 at 01:22:41 PM EST
[ Parent ]
There was once a plan for a really huge (I think 80 km) collider in the USA. They had already hired 2000 people. Then the program was canceled. Quite a number of these scientists made their way into finance and made complex derivatives a lot more popular.

So without CERN, Europe might simply have more financial ABS, CDO, CDS, SIV soup. In some countries this counts as a productivity increase, but I prefer the Internet.

The American is the orchid among humans
Volker Pispers

by Martin (weiser.mensch(at)googlemail.com) on Fri Feb 22nd, 2008 at 09:21:12 AM EST
[ Parent ]
Yeah, that project was canned in 1993. The media at the time promoted it as the perfect example of government waste.

you are the media you consume.

by MillMan (millguy at gmail) on Fri Feb 22nd, 2008 at 01:24:34 PM EST
[ Parent ]
An interesting episode in the killing of the SSC was how, during the Congressional hearings, a prominent physicist (someone big, like John A. Wheeler or Murray Gell-Mann or Steven Weinberg) was asked by a congressman whether the LHC would provide evidence of the existence of God. The negative answer did not help.

We have met the enemy, and he is us — Pogo
by Carrie (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 03:05:41 PM EST
[ Parent ]
Nice try, but you didn't get the second LHC->SSC.
Deleting your own comments! I see what the frontpagers are up to now... For shame!
by someone (s0me1smail(a)gmail(d)com) on Fri Feb 22nd, 2008 at 03:09:24 PM EST
[ Parent ]
I was only 16 at the time and not particularly into current events, but it was clearly a political circus if there ever was one.

you are the media you consume.

by MillMan (millguy at gmail) on Fri Feb 22nd, 2008 at 03:10:18 PM EST
[ Parent ]
