1 - Establish a government body and hire people to write good operating systems (for whatever device categories need them - including but not limited to home PCs, network servers, database hubs, and the computer bits that run industry and infrastructure hardware) and basic application software, with an eye on the medium to long term, and with things like security and reliability built in from the beginning.
2 - As a side project, do basic research into things like software verification and whatever other fundamentals we don't understand all that well, but which might be useful to the staff working on 1.
3 - When a bunch of that work starts to coalesce, think about standards based on it, and about how to use them to bring everybody else up to par over time.
What I really meant by standards, at least when I was writing it, was something less like ASCII and more like an objective way of measuring how secure a piece of software is. I don't think there's currently any way to formally state or measure something like that, and this seems like a problem. Maybe it's utterly impossible, but it seems like it would be useful to have a proper security rating that is properly testable, with legal restrictions based on it. For example, anything that accesses the internet must get an 8/10 on the formal security scale, or something.
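To make the idea concrete, here is a minimal sketch, in Python, of what a machine-checkable rule of that kind might look like. It is purely hypothetical: the 0-10 scale, the field names, and the 8/10 threshold are inventions for illustration, not an existing standard.

from dataclasses import dataclass

# Hypothetical: minimum formal score (out of 10) required for internet-facing software.
NETWORK_ACCESS_THRESHOLD = 8

@dataclass
class SecurityCertificate:
    product: str
    formal_score: int    # 0-10, assigned by some (as yet nonexistent) formal evaluation process
    uses_network: bool

def may_access_internet(cert: SecurityCertificate) -> bool:
    """Apply the proposed rule: software that talks to the internet must meet the threshold."""
    if not cert.uses_network:
        return True  # the rule only constrains internet-facing software
    return cert.formal_score >= NETWORK_ACCESS_THRESHOLD

# Example: a browser scoring 6/10 would fail the proposed requirement.
print(may_access_internet(SecurityCertificate("ExampleBrowser", 6, True)))  # prints False

The hard part, of course, is where formal_score comes from, which is exactly the measurement problem described above.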
Even if you devised a perfectly secure system - using quantum signalling, or something - there's still a key on file somewhere, or stuck on a post-it note next to someone's desk. Etc.
Even if not, security services will demand a back door, which can be exploited.
Security is relative. Most security is non-existent. A few applications pretend to offer 'almost good enough', with hope rather than certainty.
All information has a market value, and if the cost of breaking security is higher than the value, you're safe, up to a point.
But some hackers like breaking into things just because they can. So 'secure' is pretty much meaningless in absolute terms, and certainly not something you can rely on with any confidence.
And even if the project fails in terms of its main goal, it's possible that something good may well come of it. It's a heck of a lot more likely than putting people to work on weapons tech, where success is its own form of failure.