Welcome to European Tribune. It's gone a bit quiet around here these days, but it's still going.
I'm not really thinking of a set of standards, like ASCII, but rather a national institute whose job it is to write a solid, up to date, secure and reliable OS.

Appalling as the current corporate mess is, I think a government managed mess would be even worse.

The problem is that computer science is not like physical engineering. Bridges and nukes have obvious failure paths. If something falls over and/or explodes and/or kills people, someone has failed. The science is relatively stable, and the modelling tools are advanced enough to have some obvious relationship with reality.

There is no equivalent concept of modelling in Comp Sci. So there is no such thing as provable reliability, or provable security. The last time someone tried to create a reliable, secure system, they made something called Ada, which is pretty much a joke.

A variety of formal methods have come and gone, but most are based on managerial ad hoc-ery - like Agile, which includes super-sophisticated techniques such as forcing programmers to work in pairs and take turns on the keyboard while the other says 'Hang on, that line isn't a good idea.'

That's where comp 'science' is. It's really just a lot of people throwing things at the market and hoping some of them stick, with the occasional addition of marketing and evangelism (including open source) for extra adhesion.

So the idea of a 'provably reliable and secure' system is - optimistic. Given the state of the art, no such thing can exist. Hackers are enormously inventive, and if you give them nation state level supercomputers to play with, they can crack pretty much any security that isn't nailed down with physical rivets and air gaps. (And even then everyone knows you can crack that by bribing people.)

As for better web and email - yes, but, you'd have to rewrite all the email and web software in the entire world.

Just because something can be improved technically - and there's plenty to dislike about email and web software - doesn't mean it can be improved in practice.

Put all of those points together and you have systems that can't be modelled accurately, can't be proven to be secure and reliable, are perpetually being tested for exploits by some very clever people, and can't be replaced with something better for practical reasons.

Or in other words - good idea, but no.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Sun Jun 9th, 2013 at 08:02:41 AM EST
[ Parent ]
Heartily agree.  The following is intended as expansion on two points, not a criticism of TBG's post.

Re: CompSci

A good example of the numbnuts things Computer Scientists come up with is their horror of unrestricted GOTOs.  Well, I've got news.  If all unrestricted GOTOs were eliminated nobody could boot their computer.  If unrestricted GOTOs were eliminated we wouldn't have Assembler-level Branch instructions.  If unrestricted GOTOs were eliminated we'd have to drop the "L" in ALU.

The CompSci folks have very, very, carefully isolated CompSci from Reality.

TBG:

... you have systems that ... can't be replaced with something better for practical reasons.

Nothing is going to change until somebody derives a "More Better" way of doing things in degree and kind.

The three biggest reasons are:

  1.  The total global market for Computers is roughly US $900 billion/year when everything from clamshell mobiles to 900-teraflop machines is added together.

  2.  The installed base - hardware, software, "mental-ware," and the union of all three - is vast in terms of both money and non-monetary assets.  I have no idea what these work out to in dollars but US $200 trillion is probably conservative.

  3.  A challenge to the above that is not substantially "More Better" in degree and kind is fiercely resisted; a current example is the outcry over Microsoft Windows 8.0.

As a Computer Engineer I look at the awe-inspiring advances in hardware technology and am disgusted by the lack of advances in software.


She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre
by ATinNM on Sun Jun 9th, 2013 at 11:59:25 AM EST
[ Parent ]
The observation was that using gotos generally makes a program much harder to understand.  It's the programming equivalent of 'you halve your readership for every equation you use'.

People have tried sneaking them back in the guise of callback based programming, and that results in programs that are even harder to understand.

Yes, they are used at the cpu instruction set level.  But almost nobody writes (or even understands) that.

by njh on Tue Jun 11th, 2013 at 12:34:12 PM EST
[ Parent ]
And Branch instructions.

Good Flow-of-Control design eliminates 95% of unrestricted GOTOs.  The other 5% are unavoidable.  I can't help the fact that "almost nobody" understands the instruction set of the CPU powering the machine they are programming, or the Demons of Stupidity that flow therefrom.  The solution for ignorance is to learn.

(Hey!  You kids.  GTFO my address bus.)

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre

by ATinNM on Wed Jun 12th, 2013 at 01:21:36 PM EST
[ Parent ]
The solution is to use INTERCAL which disallowed GOTO, providing the COME FROM statement instead.
by gk (gk (gk quattro due due sette @gmail.com)) on Wed Jun 12th, 2013 at 06:56:06 PM EST
[ Parent ]
And the far more troubling COMPUTED COME FROM.
by njh on Thu Jun 13th, 2013 at 12:42:17 PM EST
[ Parent ]
I think modern architectures are simply beyond most humans.  Can you even explain what the carry-less multiplication instruction is for, or how the branch prediction system works?  And is it worth having all that understanding when you'll be writing programs which are mostly just moving bytes backwards and forwards?
by njh on Thu Jun 13th, 2013 at 01:10:05 PM EST
[ Parent ]
Going all the way back to the beginning, architectures were beyond most humans.  Most humans have no interest in computer architecture so they don't undergo the grind to learn.  

There's no Law stating everybody has to learn computer architecture.  However, when the discussion moves to Standards or cybernetic design the people involved damn well better have some idea.  Otherwise the thing will end up as a royal mess: LISA, Windows 8, etc.

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre

by ATinNM on Fri Jun 14th, 2013 at 11:01:11 AM EST
[ Parent ]
And Gotos really fuck up verification/provability as well.
by Colman (colman at eurotrib.com) on Wed Jun 12th, 2013 at 01:37:23 PM EST
[ Parent ]
And maintenance.  I've done my share of trying to keep spaghetti code with a double portion of GOTO sauce from falling over.  Fun it ain't.

In a proper design GOTO is verifiable to the same extent any program is.  The trick is "proper design."  

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre

by ATinNM on Fri Jun 14th, 2013 at 11:09:58 AM EST
[ Parent ]
Is it not possible to segregate program memory from applications memory and to sufficiently restrict access to program memory that web access is impossible - starting with 'cookies'? I can see that this might be inconvenient and might trash some currently beloved commercial practices but I would be happy to have only a few software changes per year that I installed via CD I received in the mail from a trusted source. Were this to drastically cut into the business models of Google, Yahoo, etc. they or their replacements could devise new ways to make money. Would work for me. After all, this is a Utopian thread. :-)

"It is not necessary to have hope in order to persevere."
by ARGeezer (ARGeezer a in a circle eurotrib daught com) on Mon Jun 10th, 2013 at 01:55:33 PM EST
[ Parent ]
H'mmm.  OK.  

First you gotta understand commercially available consumer computer architecture is stuck in 1975 using "mental-ware" thunk-up in 1956.  

Second, the security folks are brought in after the hardware, hardware/software, and software folks have designed and built their stuff.  And the conversation goes something like:

Management:  OK, make this secure.

Security Folks:  We need to make fundamental design changes x, y, and z to secure the system.

Management:  Can't change anything.  OK, fake it.

Security Folks:  ^#$%^&!

So ...

Is it not possible to segregate program memory from applications memory

Yes and No.

Yes in the sense everything needed is lying around waiting for someones to get off their asses and plug 'em together.  (The Raspberry Pi gives a hint of what is now possible.)

No because the Decision Makers in the Computer Industry are either technologically illiterate or technologically obsolete.  

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre

by ATinNM on Mon Jun 10th, 2013 at 04:03:02 PM EST
[ Parent ]
So computer insecurity is just another aspect of the present financialized economy where all decisions are made on the basis of the short term profits of the biggest economic incumbents - unless someone can find a way to make new, spectacular profits from some disruptive development. OR - unless governments or philanthropists make grants to exceptionally able individuals, such as pi suggests down thread, or some combination.

As ASKOD has noted, there is a large potential market for secure, reliable software to run important infrastructure programs, such as taxes, welfare, the entire medical records complex, vehicle registration and tracking, real estate records, etc. The problem, again, is the structure of the society and the role of finance, which, as Migeru notes, is to be the brain cancer of our society. Trying to get software that takes as its most important roles functionality and security, under the present paradigm, will reliably turn into a death match between politically powerful business entities - with a continuation of the current bungling incompetence with regard to functionality and security remaining the likely default solution.

That is a very different conclusion from TINA, which is what the first response seemed to be tending towards. The structure of our societies, optimized as they are for the maximum wealth extraction capability of the very wealthy, quite naturally makes all aspects of our existence miserable. Perhaps the situation will self-resolve, with the current plague of viruses ending in the destruction of the ability to have any confidence in the existing ownership records within our private-ownership societies.

"It is not necessary to have hope in order to persevere."

by ARGeezer (ARGeezer a in a circle eurotrib daught com) on Mon Jun 10th, 2013 at 06:16:13 PM EST
[ Parent ]
So computer insecurity is just another aspect of the present financialized economy where all decisions are made on the basis of the short term profits of the biggest economic incumbents

The entire computer industry is just another aspect of the present financialized ... & etc.  Has to be.  

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre

by ATinNM on Tue Jun 11th, 2013 at 01:57:03 AM EST
[ Parent ]
And this, once again, is why I wonder if a completely different funding and production model for software should be given a shot.  If the majority of problems are due to past ignorance and present laziness, then focused attention and effort in the right circumstances oughta be able to accomplish something.

Fund people to do the actual hard work of rebuilding things the right way, without an expectation of profit in the near or medium term.

by Zwackus on Mon Jun 10th, 2013 at 08:55:54 PM EST
[ Parent ]
That would be more interesting than trying to enforce standards first and innovate around them later.

In fact there's a fair amount of research in Comp Sci (sic). There are even alternative OS models that are more interesting than anything that's been commercialised.

Problem is, it's not interesting to corporates and it would take too much time/money to make it economic.

What will probably happen instead is a new wave of stuff that's computing++ - a completely different hardware/software/philosophical model, completely new applications, and none of the baggage we have now.

Getting to there from here would be more interesting than trying to reinvent what's around today and make it the-same-but-better.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Tue Jun 11th, 2013 at 07:19:09 AM EST
[ Parent ]
I'm sorry, I guess I hadn't properly explained myself.  What I wanted to propose was . . .

1 - Establish a government body, and hire people to write good OSes (for whatever device categories need them - including but not limited to home PCs, network servers, database hubs, and the computer-bits that run industry and infrastructure hardware) and basic application software, with an eye on the medium to long term, and with things like security and reliability built in from the beginning.

2 - As a side project, do basic research into things like software verification and whatever other fundamentals we don't understand all that well, but which might be useful for the staff working on 1.

3 - When a bunch of the stuff starts to coalesce, think about standards based on the new stuff, and how to use them to bring everybody else up to par over time.

by Zwackus on Tue Jun 11th, 2013 at 09:41:08 AM EST
[ Parent ]
And I hadn't explained myself because a side-idea got stuck in my head in the writing, and went first, and obscured the bit that I thought was more important all along.  Grrr.

What I'd really meant by standards, at least when I was writing it, would be something less like ASCII and more like an objective way of measuring how secure a piece of software is.  I don't think there's really any way right now to formally state or measure something like that, and this seems like a problem.  Maybe it's utterly impossible, but it seems like it would be useful to have a proper security rating that is properly testable, and legal restrictions based on it.  For example, anything that accesses the internet must get an 8/10 on the formal security scale, or something.

by Zwackus on Tue Jun 11th, 2013 at 09:45:49 AM EST
[ Parent ]
There are lots of security ratings. They're mostly useless or so time consuming and expensive to pass that they apply to previous generations of tech and can only be passed by the big corporates.
by Colman (colman at eurotrib.com) on Tue Jun 11th, 2013 at 09:50:06 AM EST
[ Parent ]
Well - that's been my point here. Such a thing is simply not possible given the current state of the art, no matter how much money you throw at it and how many clever people you hire.

Even if you devised a perfectly secure system - using quantum signalling, or something - there's still a key on file somewhere, or stuck on a postit note next to someone's desk. Etc.

Even if not, security services will demand a back door, which can be exploited.

Security is relative. Most security is non-existent. A few applications pretend to offer 'almost good enough', with hope rather than certainty.

All information has a market value, and if the cost of breaking security is higher than the value, you're safe, up to a point.

But some hackers like breaking into things just because they can. So 'secure' is pretty much meaningless in absolute terms, and certainly not something you can rely on with any confidence.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Tue Jun 11th, 2013 at 11:22:18 AM EST
[ Parent ]
As computing is a fairly new thing, in terms of human endeavors, there may still be a fair bit of wiggle room when thinking about what might be possible or impossible.  Throwing steady, full-time employment at people and asking them to think about the problem may be a waste of time if all one is looking at is the final success of the project.  However, this sort of job creation program seems no more harmful or misguided than most, and if worst comes to worst, the engineers and programmers so employed, and their families, and the people from whom they purchased goods and services, will have been better off for it.

And even if the project fails in terms of its main goal, it's possible that something good may well come of it.  It's a heck of a lot more likely than putting people to work on weapons tech, where success is its own form of failure.

by Zwackus on Wed Jun 12th, 2013 at 12:52:12 AM EST
[ Parent ]
Fund people to do the actual hard work of rebuilding things the right way, without an expectation of profit in the near or medium term.
Government...

Finance is the brain [tumour] of the economy
by Migeru (migeru at eurotrib dot com) on Tue Jun 11th, 2013 at 08:16:16 AM EST
[ Parent ]
"Government..."... which so regularly has such spectacular failures while trying to buy custom software from private sector vendors. If 20% of the money regularly wasted on such fiascoes were put into an ongoing program to develop basic secure, reliable software for a range of governmental operations...we would blow a major hole in the private software business.


"It is not necessary to have hope in order to persevere."
by ARGeezer (ARGeezer a in a circle eurotrib daught com) on Tue Jun 11th, 2013 at 10:35:26 AM EST
[ Parent ]
ThatBritGuy:
There is no equivalent concept of modelling in Comp Sci. So there is no such thing as provable reliability, or provable security.

I was under the impression that, using formal methods, one can prove the reliability of a program. It's just slow and expensive, and the complexity rises sharply if the program runs on a machine that also runs any other programs. So it's only used for things like nukes and such.

But perhaps the state of formal methods can be improved if a ton of money were thrown at it. This being a diary for utopian ideas and all.

Sweden's finest (and perhaps only) collaborative, leftist e-newspaper Synapze.se

by A swedish kind of death on Mon Jun 10th, 2013 at 02:41:32 PM EST
[ Parent ]
See Formal verification. I remember some of this from an introductory programming course in my math degree 20 years ago. It was fascinating but evidently the amount of resources needed for algorithm verification of anything but the simplest algorithms was prohibitive. In addition, we only saw how this applied to simple procedural languages. Real, modern computer systems are full of event-driven programs, hardware interrupts, declarative languages, and what-not. Algorithm verification is probably only useful as a debugging technique, not for a-priori certification of real life programs.

Finance is the brain [tumour] of the economy
by Migeru (migeru at eurotrib dot com) on Mon Jun 10th, 2013 at 02:48:50 PM EST
[ Parent ]
Basically you can prove reliability if there's a finite number of program states and execution paths, you can specify the transitions through/between them, and then prove you're not in some kind of NP tar pit.

Which is fine as far as it goes. But - as you say - most software is event driven, and doesn't run like a sausage factory where stuff goes in, you wait a while as things happen, and stuff comes out.

So in practice you can't even completely specify the inputs, never mind define states/paths/transitions.

And then there are horrible things like natural language interfaces, where the semantics are inherently fuzzy and there is often no unambiguous meaning at all.

In fact I think there's going to be an eventual singularity in software development, where formal systems become smart enough to define improved formal systems. We'll probably get software proving that software sort-of works by the end of the decade, and certainly by the end of the 2020s.

Once humans no longer have to write code by hand, development will speed up a lot and become much tidier and more reliable.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Mon Jun 10th, 2013 at 03:16:37 PM EST
[ Parent ]
Formal Systems will never be able to verify formal systems until we can make an end-run around Gödel's incompleteness theorems.  Which, at the moment, is a lot like saying, "We can achieve FTL when we make an end-run around the Theory of Relativity."

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre
by ATinNM on Mon Jun 10th, 2013 at 04:11:52 PM EST
[ Parent ]
That there are undecidable questions of arithmetic doesn't mean that every algorithm verification is undecidable.

Finance is the brain [tumour] of the economy
by Migeru (migeru at eurotrib dot com) on Mon Jun 10th, 2013 at 04:23:49 PM EST
[ Parent ]
I agree some can be.  

At a former job I verified a program's algorithm would accomplish the required task and then halt.  I also found it would halt after ~600 years.  Thus it wasn't Formally NP ...

& whoop-de-do

Computer Engineering is engineering.  As such it is different from the tools it uses: Logic, Maths, etc.  Axiomatic Deductive Logic is a nice tool.  In all too many cases using only ADL is a sure route to, "How the %$^@#$! did we get into THIS mess?"

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre

by ATinNM on Wed Jun 12th, 2013 at 01:36:44 PM EST
[ Parent ]
Axiomatic Deductive Logic

No, no it's not. Just no. Bleurgh.

by Colman (colman at eurotrib.com) on Wed Jun 12th, 2013 at 01:38:55 PM EST
[ Parent ]
We have to be gentle with Migeru.  His mind has been warped by Physics.

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre
by ATinNM on Wed Jun 12th, 2013 at 03:42:41 PM EST
[ Parent ]
Gödel's incompleteness theorem was formulated explicitly for number theory. There are two parts (from 40 year old memory):

  1. Any formal system that is capable of proving as true only statements that are true is not capable of proving as true all statements known to be true.

  2. Any formal system that is capable of proving as true all statements known to be true will also prove as true statements known not to be true.

Of the two kinds of systems described, the first seems the more useful. Even though the theorem was first developed for number theory it has not prevented us from doing accurate calculations. (Which was always my problem to begin with  :-)
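For reference, the modern statements the two parts above are reaching for (they are about formal systems, not individual theorems):

```latex
% Goedel's incompleteness theorems, informal modern statements.
% Let T be a consistent, effectively axiomatizable theory containing
% enough arithmetic.
%
% First theorem: there is a sentence G_T (true in the standard model
% of arithmetic) that T can neither prove nor refute:
\[
  T \nvdash G_T \qquad \text{and} \qquad T \nvdash \neg G_T .
\]
% Second theorem: T cannot prove its own consistency:
\[
  T \nvdash \mathrm{Con}(T).
\]
```

The second theorem, the unprovability of Con(T), is the one usually invoked against fully self-verifying systems upthread.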

"It is not necessary to have hope in order to persevere."
by ARGeezer (ARGeezer a in a circle eurotrib daught com) on Tue Jun 11th, 2013 at 10:55:10 AM EST
[ Parent ]
In fact I think there's going to be an eventual singularity in software development, where formal systems become smart enough to define improved formal systems. We'll probably get software proving that software sort-of works by the end of the decade, and certainly by the end of the 2020s.

I'd say about fifty years.
by Colman (colman at eurotrib.com) on Tue Jun 11th, 2013 at 05:25:34 AM EST
[ Parent ]
Model checking is already useful for finding subtle bugs in reactive controller programs. But those methods are only used on safety-critical projects: avionics, nuclear reactor controllers, etc.

However, the general verification/synthesis problem cannot be automatically solved (there is a proof).

Schengen is toast!

by epochepoque on Thu Jun 13th, 2013 at 07:11:23 PM EST
[ Parent ]
Donald Knuth, Frequently Asked Questions
Beware of bugs in the above code; I have only proved it correct, not tried it.
by gk (gk (gk quattro due due sette @gmail.com)) on Mon Jun 10th, 2013 at 07:07:52 PM EST
[ Parent ]
LOL

Finance is the brain [tumour] of the economy
by Migeru (migeru at eurotrib dot com) on Mon Jun 10th, 2013 at 07:27:18 PM EST
[ Parent ]
Formal methods etc. are still pre-Newtonian, while software development is busy building the equivalent of relativistic starships (massive networks with feedback loops everywhere). Not a good match.
by Colman (colman at eurotrib.com) on Tue Jun 11th, 2013 at 05:24:47 AM EST
[ Parent ]
In my experience, formal methods become mature enough for industrial applications when the researchers involved decide they need to buy a house and settle down.
by Colman (colman at eurotrib.com) on Tue Jun 11th, 2013 at 06:01:45 AM EST
[ Parent ]


She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre
by ATinNM on Mon Jun 10th, 2013 at 08:46:37 PM EST
[ Parent ]
Marketing & Sales:  Fine, we'll put it out there and let the punters, er, customers debug it.
by afew (afew(a in a circle)eurotrib_dot_com) on Tue Jun 11th, 2013 at 01:41:37 AM EST
[ Parent ]
