Someone above mentioned HTML as a spin-off of CERN. But should we really believe something similar would not have been invented elsewhere within a few years, had there been no CERN? Especially considering that the thousands of smart people who work on these projects wouldn't disappear, but would be working elsewhere, on other projects with potential spin-offs.
And on what projects do you think these thousands of smart people would have worked? Financial innovations? Micronukes? "The American is the orchid among people." — Volker Pispers
As for HTML, why not invert that idea? Perhaps without CERN it would have been invented earlier... The point is that there is so little relationship between CERN's activities and HTML that it seems too strong to claim that without CERN, the WWW would have taken 5 years more.
After all, the Web depends not just on HTML, but on a whole lot of interdependent technologies, both in hardware and software, that were developing in the 1980s.
Particle physics had progressed so fast since the 1940s that the particle physics community had developed a system of "preprints", in which people circulated drafts of their papers to colleagues at their own and other institutions months before they were published in journals. The story goes that Tim Berners-Lee got tired of e-mailing documents back and forth to colleagues at CERN and decided to invent HTML and code a bare-bones browser to allow him to (as we would say today) webcast his research. There is something about the pace of information exchange within CERN and in the particle physics community that supports the idea that HTML might have taken 5 more years to be developed elsewhere (and it would have been some university or other: USENET and the text-based tools that went with it, and Gopher, developed in that environment).
The large particle physics laboratories do employ thousands of physicists, engineers and programmers specifically for particle physics experiments, and that is a non-negligible fraction of the respective academic communities. If the large labs didn't exist, these people would be competing for academic jobs elsewhere, which would result in more people going to industry, as well as fewer people getting doctorates.
If LHC funding hadn't gone through, CERN would have stagnated and maybe shrunk. You need far fewer people to run the existing facilities than you do to develop a new facility, and the LHC research programme is much more intense than what can be carried out at the existing facilities (not that that isn't useful too, but it's on a smaller scale in terms of people and resources).
Consider CERN and the LHC a Keynesian stimulus package for physics and engineering. We have met the enemy, and he is us — Pogo
After all, it is easy to "predict" in hindsight that CERN would be the perfect place to develop a useful hypertext system. But if one wants to use the unexpected, unpredictable benefits of a project as one of the arguments for funding it, there has to be a rationale for why this particular project or field is especially likely to lead to unexpected benefits.
So, big science drives technological developments in established fields, as well as occasionally resulting in new technology. [I distinguish two basic modes of technological progress: secular improvements in technology and new technologies - only the latter qualifies as "innovation" IMHO, and that is not predictable in the way that one can use, say, Moore's law when designing the specs of a computer system to be deployed 5 years in the future.] We have met the enemy, and he is us — Pogo
I would argue it's innovation all the way through. Some improvements change a subfield, and from the outside it looks like gradual, expected improvement. Some change a field, and the outside world can notice it and say it's something fundamentally different.
The difference between the dynamical systems we are used to considering in physics and biological or economic evolution is the possibility of the system of differential/difference equations changing dimensionality in response to processes within the system itself. We have met the enemy, and he is us — Pogo
But in reality, new products and inventions, even improvements on existing ones, are usually not that simple. They add an extra dimension, more freedom to find better solutions to problems. But in a high-level, low-dimensional description, this freedom can be collapsed into a change in parameters, or genuinely added as an extra dimension if the effects are important enough.
Funny thing is, I am currently working on shape optimization, where it is completely natural to change the number of parameters used to describe the shape, and thus the dimension of the problem.
A related field is order reduction, where you try to (locally) approximate a physical phenomenon by its most important modes. If there is a change in the physics, you can either modify the modes while keeping the same number of them, or you may find that the new situation requires more modes to be described well enough.
I would suggest this is a good analogy for your innovation/improvement distinction.
I am familiar with dimension reduction (proper orthogonal modes, principal components, factor analysis...) and you're right, at some level the number of variables is a matter of choice. But you still have to be able to close the system of equations. You can always ascribe the effect of all the neglected modes to "noise", though. We have met the enemy, and he is us — Pogo
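As an aside, here is a minimal sketch of what that mode truncation looks like in practice, using SVD-based proper orthogonal decomposition on snapshot data. The array sizes, the random data standing in for a simulation, and the choice of four retained modes are all just illustrative assumptions, not anything from the discussion above:

import numpy as np

# Snapshot matrix: each column is the state of a system at one instant
# (random data here, standing in for real simulation output).
rng = np.random.default_rng(0)
snapshots = rng.standard_normal((1000, 200))   # 1000 state variables, 200 snapshots

# Proper orthogonal decomposition via the SVD: the columns of U are the modes,
# and the singular values measure how much each mode contributes.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)

# Keep only the r leading modes; everything else gets lumped into "noise".
r = 4
U_r = U[:, :r]

# The reduced description: r coefficients per snapshot instead of 1000 variables.
coeffs = U_r.T @ snapshots

# Reconstruct and measure what the truncated model throws away.
reconstruction = U_r @ coeffs
rel_error = np.linalg.norm(snapshots - reconstruction) / np.linalg.norm(snapshots)
print(f"relative error with {r} modes: {rel_error:.3f}")

Changing r when the physics changes is, in this picture, exactly the change in dimensionality being discussed; the neglected modes are what gets ascribed to "noise".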
One of the main things to come out of Apollo etc., IIRC, was the development of computer chips for the project. Large advances in microchips and materials science filtered out to the outside world. While industry might have got there as well, I'd say it would almost certainly have got there more slowly, due to the very nature of business: a business looking at short-term profit is far less likely to allow its researchers the time and space to create a bigger, longer-lived project with its associated spin-offs.
Early research is expensive mainly because you don't know what the right solution is - it could be any of a number of different options, and until you pick one you don't know. So there has to be a lot of investment without too much pressure for immediate results on every route, as a lot of them will be blind alleys - but without checking, you'll never know whether they were the right ones or not.
But the fact is, the Apollo program was a one-shot thing. It was wound down and the US lost its ability to fly to the moon. It also discontinued its high-payload rockets in favour of the Space Shuttle, so now the rocket market is occupied by the European Ariane and the Russian Proton.
The Soviet manned space program made more scientific and technical sense than the American one, and the ISS owes more to the Russian Soyuz and Mir than to the American Skylab, which was also discontinued and folded into the Shuttle. We have met the enemy, and he is us — Pogo
One side story that I found particularly intriguing was a note between, I think, McNamara and Lyndon Johnson, in the early 60s. In it they discuss the budget surpluses they are expecting for the late '60s, and they fear Congress will call for tax reductions before they can use the surpluses for their Great Society plans. So in the meantime they see Apollo as a good and popular way to keep the budget balanced until they have better things to do with the money. Then came Vietnam...
But more on topic, the whole 'spin-off' concept seems to have been pretty much invented by NASA in later years to justify its budgets, and it is used for many 'big science & engineering' projects when the wider public has doubts about the costs.
(2) Missile programmes, you say? Because military programmes with large destructive potential are soooo useful, while high energy physics, space exploration and the like are vanity projects! And you know how the military loves to share the technology it develops and would not like to keep it secret. One of the great advantages of the large-lab high energy physics environment is exactly that it is not a military programme. We don't build things to kill people; I think this is a plus. Further, there is no tendency to keep progress secret. Quite the opposite, in fact, so there is a greater chance that progress made here can diffuse faster and wider.
(Disclosure: I work at CERN.)
There's very little evidence to suggest that Apollo contributed directly to electronic design. The first integrated circuits were demonstrated in 1958. Computers with modular logic were built around the same time. Putting the two together was the next obvious step, and would have happened anyway.
Apollo was mostly a PR exercise for US science and engineering. There may have been some spin-offs in materials science and - obviously - rocket science. But Apollo hasn't left much of a trace in computer science history.
In fact it's rarely mentioned at all. Projects like SAGE, which was the first generation US air defence network, were much more important. NASA did buy a big IBM System/360 for Apollo, but System/360 was already around, and IBM were more interested in selling it as a tool for airline bookings than managing a space program with it.
Keep in mind that the big advantage of ICs, even in those years, was the possibility of getting prices down through mass production. Not really something the space program, or even Minuteman, was very concerned about.
Someone above mentioned HTML as a spin-off of CERN. But should we really believe something similar would not have been invented elsewhere within a few years, had there been no CERN?
That's a really hard question to answer. HTML didn't happen directly because of CERN, but it happened because CERN was an environment in which a quick mark-up system would be instantly useful, and because there was no need for 'research' to invent anything more complicated.
There were many, many alternatives to HTML, including horrible things from academia that are best forgotten.
I know people who were researching them, and while they were often better than HTML in many ways - e.g. no broken links - they were also wretchedly overcomplicated, with limited public appeal.
So HTML might well have never happened in its current form. We could easily have had some kind of Windows-ish or other system of gargantuan complexity and slowness.
If you look at academic vs 'public' computing there's a clear pattern of highly abstracted command-line tools in academia (e.g. LaTeX), and much simpler WYSIWYG colouring-book computing in the public area.
HTML broke that pattern by doing something script-ish but relatively simple inside academia, which subsequently escaped into the wild.
That hasn't really happened before, which I think means it's not something that could be relied on.
Or in other words - it's likely CERN got lucky.
Had that scenario not occurred, the industry would not have existed until advances in other sciences and industries had created commercial viability for it indirectly.
you are the media you consume.
Mainframe computing was established by the late 50s, and mini-computing was just starting up. The market was already worth $billions by then. There were some prestige military projects - e.g. SAGE again - and a lot of DARPA funding for research. But the civilian market was already huge, with its own momentum.
Once TI introduced TTL logic in the early 60s, computers became a lot cheaper. At the same time a strong hobbyist culture fanned by magazines kept interest in technology running very high, so there was a steady stream of wannabe engineers with experience of digital techniques from their early teens.
Microprocessors were already being planned in the mid-60s. The biggest gap was between commercial computing and the microprocessor market, and that was bridged by developing a general-purpose microprocessor and putting it into a commercial product - a desktop calculator. It wasn't a military project.
Now you had a hobbyist/hacker culture with access to microprocessors and a background of DARPA-funded interface and networking research.
The rest was probably inevitable.
What's astonishing is how fast it happened. Most of the core ideas - laptops, databases, the web, GUIs and interactivity, distributed processing, networking, 3D graphics - appeared between 1958 and 1968.
There's been very little genuinely new since then. Most of what's happened has been faster and cheaper, but not so truly innovative.
The early commercial viability of mainframes is a good point that I managed to forget. I'll still make my vague 20 year claim, though.
I agree that it all happened shockingly fast.
Most of what's happened has been faster and cheaper, but not so truly innovative.
I disagree. Reading IEEE magazines since I became an EE major in college, I've seen some stunning work over the years in semiconductor physics that has been required to reach the commercially viable transistor sizes we have today. From the computing point of view, though, I agree with what you're saying.
So without CERN, Europe might simply have more financial ABS, CDO, CDS, SIV soup. In some countries this counts as a productivity increase, but on this point I prefer the Internet. "The American is the orchid among people." — Volker Pispers