thinking is dangerous — it leads to ideas
President of the Board of the Polish Free and Open Source Software Foundation. Hacktivist for human rights in the digital era, Free Software advocate, privacy and anonymity evangelist; expert volunteer with the Panoptykon Foundation; co-organizer of SocHack social hackathons; charter member of the Warsaw Hackerspace; Telecomix co-operator; biker, sailor.
The Internet is a very young invention: its precursor, ARPANET, was created in the late 60s. Most of the Internet phenomena we partake in and the tools we use are much younger still. The first social networks, for example, were created in the nineties, but the boom for them came only in the first decade of the 21st century.
That means, simply put, that we do not know how to use them yet; we have had no time to learn.
Naturally, people try to operate in a new reality, in new situations, with new technology, by analogy to the situations and technologies they already know. Only after such practical attempts (often lasting years) does an understanding arise of how the new tool differs from the old, well-known ones. And with that come new regulations and new customs governing the use of the new technology.
When the first "automobiles" showed up, they were treated more or less as carriages, a technology known for ages. It soon turned out, however, that cars are much faster and hence much more dangerous; this led to new customs and laws, and a whole new culture of using cars, one that accounted for the "otherness" of this new tool and the new reality.
At first, some of the regulations were absurd. The red flag laws, for instance, required that each "mechanical carriage" be preceded by a person on foot carrying a flag or lantern, warning of the machine's approach. With time, though, people learned how to use cars and how to regulate their use, and the new tool gradually stopped being new.
Finally, the tool became familiar; cars are now an ordinary part of everyday life. We all more or less know and understand the rules, like looking around before crossing the street or wearing seatbelts in a car. Some of these rules entered common knowledge and custom; some were made into law. In general, we all know how to behave in a world with cars, something our grandparents did not.
With the Internet, social media and the rest of information technology (until quite recently called, unsurprisingly, "new technologies"), we are in a situation similar to that of our forefathers in the first years of motorisation. The technology has changed, and it has changed our reality. We do not know, and are unable to foresee, all the consequences of our actions in the virtual space. The rules, the culture of using the Internet and the laws pertaining to it are only now being created.
In many ways our situation is actually much worse. Technology changes much faster than it did a few decades ago, which makes it much harder for culture, customs and regulations to catch up. Worse still: the ill effects of our inability to foresee the consequences of certain actions are usually significantly delayed and not as spectacular as the consequences of failing to notice an approaching car (although not always).
Hence, it is harder for us to notice such ill effects, take them into account and revise our customs, our Internet hygiene. If the consequences of my bad decision surface only 5 years from now, at a job interview I fail because of party pictures published 2 years ago, then for those 7 years I remain unaware that putting the pictures on-line was a mistake. And there is a huge chance I will repeat that mistake many, many times in the meantime.
Not only does technology change faster than ever, we also react to those changes (by adjusting our customs, culture and laws) more slowly than before. This is extremely dangerous, but we will only take notice of it 5 or 10 years from now, when today's teens are trying to get their first jobs and their prospective employers are vetting them with help from Uncle Google and Big Brother Facebook.
This is precisely why some organisations and people (including myself) warn against giving up our privacy (often without citizens even being aware of it), against handing our personal data and private communication over to corporations and centralized networks.
It is, of course, possible that we are overreacting, like the proponents of the red flag laws. But it is often better to be a bit overcautious than to sink into technocomplacency and blind fascination with new technologies.