I slept really poorly because of the noise caused by extreme wind and rain. For some reason, the worse I sleep, the more my brain tends to obsess over complicated topics, and the stronger the urge to write rants about them becomes. This time, I couldn't resist.
The relevant world events for today's rant include:
Discord deciding to roll out mandatory age verification globally for accessing certain platform features and content deemed "unsafe," essentially treating everyone as kids until they use relatively invasive methods to attest otherwise;
My country deciding to ban "social media" for those under 16 (but really, it's going to apply to basically every communications service), necessitating widespread rollout of age attestation systems, of course;
The news that the European Commission is moving to "ban infinite scrolling," which, depending on how the regulations end up being worded, could either be dangerously vague or actually quite well targeted.
These are all moves that I find difficult to disagree with, at least if we choose to believe they are disconnected from any hidden agendas. But I wouldn't be the first to point out how dangerous the "think of the children" slippery slope is, nor just how much metadata the different age attestation mechanisms leave in the hands of all parties involved - even in designs that promise secrecy - so I will jump straight over that.
I started messing with computers at a very young age, though not as young as the age at which many children these days seem to end up with a smartphone or tablet in their hands. In fact, it's been twenty years since I took my first steps in learning how to program computers, and I'm not even 30 years old yet. Meanwhile, today it seems that many eighteen-year-olds might not really know what files and folders are. Will they please get off my lawn!
I have always cared, probably too much, about other people's expectations of me, and when I was a kid, those were mainly adults' expectations. "Doing no wrong" was so ingrained in my mind that, despite my proficiency at such a young age, I feel I engaged in way fewer "black hat type activities" than most kids with such interests at the time... perhaps apart from developing custom low-level software for my graphical calculator, which my colleagues and I totally didn't use in exams - but that's still such a mild thing.
The notion that software itself could be illegal was not on my mind during my earlier formative years, possibly because none of the adults who mentored me had it in their minds either. Yes, I understood that when people used eMule to download shitty MP3 rips, that was questionable - even if the local law, I went on to find out, is surprisingly amenable towards significant portions of that activity. But I never thought of eMule itself as something bound to be banned by law.
Looking back, I think that was also the understanding of the majorities in charge in many democratic countries - and whenever politicians started to present ideas to the contrary, a bunch of experts and citizens' initiatives would be quick to explain/dazzle them with the technical intricacies of how it is perfectly legal to torrent Linux ISOs. The 2000s were still very much the wild west for the internet and associated regulatory frameworks.
Yes, I'm definitely aware of what happened with Napster, but the more decentralized solutions that followed its demise seemed mostly immune from legal prosecution, at least as far as the software projects themselves were concerned. To this day, BitTorrent (the protocol) is still legal in most countries. Will decentralized social networking protocols fare as well?
I definitely feel like an old man with rose-tinted glasses as I ponder whether most societies were perhaps better off when technology, and especially the internet, and especially the giant platforms that are "the internet" to the vast majority of users today, weren't as pervasive. The current pervasiveness of these platforms, and the outright critical economic dependency on them, has made it so that they shape societies; governments, perhaps correctly, feel the irresistible urge to regulate it all. It's probably a better outcome than letting governments themselves become obsolete at the hands of the giant private platforms; some even argue it is fundamental for democracy not to fall apart.
I think I was around 16 years old, deciding on an open source license to apply to my embedded software for graphical calculators, when I first came across the topic of export control laws in relation to software. That must have been shortly before my first visits to the Wikipedia pages on illegal numbers and DeCSS. Despite the evidence right in front of my eyes, I struggled to wrap my mind around the idea that certain software, numbers, math could be outright illegal, and/or the subject of regulations.
At the time, the idea that computers could be used to do nefarious things was definitely not alien to me. Nor was the notion that lying to a computer could be a crime - typically some form of fraud. All of that already sat very well in my mind. I've always seen software as the tool, with the responsibility ultimately falling on those operating the machines. Why did I find the notion of illegal software so aggravating, then? I suppose that, at the time, I didn't realize that the possession and development of some physical tools is outright illegal in many jurisdictions, too: firearms come to mind as an obvious example. But software, and math, feel closer to knowledge than to a physical tool.
In the ethics course that was part of my Bachelor's degree, both the professors and the entire course were a bit of a meme. One notion they presented was that our professions, in information technology in general, were bound to become more regulated over time, much like other engineering disciplines. I found this notion a bit revolting, too, and dismissed it as if it were part of the "meme." I suppose I had really enjoyed the sense of freedom that computers had brought me over the years, even from the point of view of a white hat hacker.
To think that such freedom could eventually be at stake, that I could one day be penalized for designing a UI a certain way, much like a civil engineer would suffer consequences for badly designing a bridge, felt uncomfortable to me. But look at the world around us: everything is increasingly software, and there's ever more software whose behavior could harm more people than a collapsed bridge.
I definitely feel like an old man with rose-tinted glasses as I ponder whether most societies were perhaps better off when technology, (...)
I think many people are still stuck on this notion, that I too had, that math can't be illegal. To this day I still find the idea of regulating software generally revolting, even if I can reason through all of this and begin to understand why things are the way they are. Still, let's just say that it isn't just the rain and wind that have been preventing me from sleeping properly.
These days, I realize that anything can be illegal; those who hold the power of force ultimately decide what gets to be illegal, and they can make the most deranged decisions to outlaw literally anything. Next week, the democratically elected government of my country could decide that people can only use text editors for one hour per day and that video games are banned, even if none of our representatives campaigned on that. Citizens wouldn't have any choice but to comply - or, I suppose, flee the country, if that were still allowed.
The notion that math is somehow immune from tyranny is probably held only by the citizens of (formerly) progressive societies. In the past, we would look to more restrictive, authoritarian societies and everyone would rightly label their tyranny as wrong. Now, it feels like we are gradually copying the patterns of those societies we used to criticize, in part because some of us are outright voting for it, and in part because there are rarely any repercussions when politicians vote against the views the populace expected them to represent.
By the time Tornado Cash went down, it was still slightly revolting, but not nearly as surprising to me. Note that I never thought that enabling money laundering and other illegal activities was a noble goal. When I say revolting, I mean in the sense that it once again shattered this (incorrect) idea I somehow continue to deeply hold: that software is somehow "pure," "untouchable," that it should always be allowed to exist by itself, and that developing any kind of software should always be allowed. Whenever that notion is proven wrong, I still become a bit angry.
It's the 2020s and "AI" is here. There are many ramifications to all of this, in a world where software begins to have what could be considered free will, as far as many problem spaces are concerned. (Turns out "free will" might as well just be a stochastic process.) To regulate AI will mean to regulate software further, but it has already begun, it is an inevitability, and lots of people are already asking incessantly for more of it.
Parts of me will likely feel angry as AI gets further regulated, while other parts will feel relieved. Regardless, this is all happening too fast to be figured out in any democratic, sane capacity. I do think we are entering an age of darkness as we struggle to reconcile technological progress, and the concentration of "technological power," with the way our economies and societies work.
I struggle to think about how I would raise kids in today's world, especially when it comes to the very areas I work in. Proper parenting has always required a lot of effort, and I think this continues to be the case when it comes to teaching kids to safely navigate technology and keeping an eye on what they are up to. However, it's undeniable that virtual spaces are more vast than they ever were, and with all the "algorithms," they can't be navigated linearly anymore.
The truth is that most parents have never had the time nor the means to parent and educate to a great degree on their own, so societies organized themselves such that kids go to school, as a complementary measure. You would therefore expect schools to also teach digital literacy, since parents seemingly don't have the means for that.
The problem is that nobody knows how to educate for these things at this point. Technological progress is outpacing educational system progress. What was being taught in terms of digital literacy while I grew up is so outdated that parts of it are probably even bad advice now. Reminder: it's 2026, and text with typos on a random website you've never heard of is more "organic" and "real," and might be more correct, than many bot posts on social media - or even on news websites! The website of your bank no longer has a giant green field in browsers' address bars. What's an address bar, anyway?
With such fast-paced changes, when it comes to the age assurance debacle, it's difficult to blame the parents for everything. While my initial reaction was to think "why don't parents just parent better?", I eventually realized I also wouldn't know how to parent better. School hasn't prepared me for any of this, most people don't have sufficient digital literacy to teach it either, and "specialists" are outpaced every time a new impactful invention like ChatGPT appears. I might think I have a good mentoring strategy now, and two years later my assumptions regarding how technology would impact my children could turn out to be wrong, and "the damage" would be done.
Note that this debacle goes back much further than phones in kids' hands: just a couple of decades ago, Concerned Parents™ were debating the effects of TV on kids. TV sets that could simply have been put out of kids' reach... yet few parents were strict about that. Adults liked watching TV, too, and today many of them like mindlessly scrolling on their phones as well.
Regardless of what the future holds, one thing is clear: my teenage hobbies, the way I got into my career - none of that will be available to my successors in the same way. I was creating accounts on forums about web hosting when I was like 13, without informing my parents, for crying out loud. I definitely feel like an old man with rose-tinted glasses as I realize that the way I did it won't be the way anyone's kids will ever do it in the future. Maybe I shouldn't oppose legislation just because it moves us further away from the way things were when I was younger?
Societies are starting to believe that social media, or the way it is offered by the entities operating the most popular platforms, is not good for their teenagers, and probably not even for their adults. But nobody can fully grasp the specifics of why, even if strong theories exist. There appears to be a lack of courage to seriously regulate the platforms themselves (and I am fully aware that would mean regulating software), likely because nobody can tell with certainty which aspects are damaging, and over-regulation is problematic too. Most importantly: nobody wants to throw away a toy that they, too, enjoy.
Regulators seem to want to treat social networks, and perhaps even internet communication at large, as things that are damaging in themselves, like tobacco or alcohol are, without pausing to consider whether it isn't more about the particular aspects of the social networks that most people are hooked on. Except... this news about banning infinite scrolling might actually be a step in the right direction, I say, as another part of me screams "have we sunk so low as to regulate UI design?!" Then I am reminded that the UI of cars is regulated, and we're probably physically safer because of it. Hmm.
To go back to a world where common folk don't communicate globally with "everyone" over the internet seems unthinkable; but are we so far off from that? China has had its Great Firewall practically forever. Are we bound to start copying some patterns here, too? For the sake of public health... but who decides what's a healthy mind, in the end?
There is certainly a difference between software in its most theoretical form, and the offering of a service. The key difference is: knowledge and math concepts by themselves don't communicate; communication is what enables software to become a service, to become more than just math, even if it's just within the local realm of one's CPU and peripherals. Ultimately, this is all about regulating communication: between humans, between machines, and between humans and machines.
Regulating communication? It very much feels like "been there, done that;" there are reasons why "freedom of speech" has very different flavors depending on the country you're in. Has the internet changed something about the properties of free speech, or did we just willingly lose control over the speech we pay mind to, because the dopamine hits were just too good to pass up?
Privacy is about communication, too: it's the freedom to choose what we keep to ourselves versus what we communicate with others. It is not surprising that communication regulations would impact privacy. Right now, it feels like everyone is going to suffer through slightly more monitoring, because parents weren't given the means, including time and knowledge, to mentor their kids effectively.
Nobody will magically learn to use some technology properly just because they turned a certain age, even when being a certain age is a prerequisite to learning it. Hmm, for driving, we have lessons... I think I can see where this is going: license and registration, please. (Forget incognito windows, I suppose.) And much like some of my parents' and grandparents' generation learned to drive before they were actually meant to, I suppose I will be one of the elderly who learned to drive the internet before being of legal age.
This recent wave of regulations is, ostensibly, meant to solve problems introduced by services operated by increasingly powerful international entities. Yet, they seem to have more of an effect on individuals' rights and freedoms than they pose a challenge to the social networks, or an incentive for them to be less damaging. Individuals are giving up convenience and some privacy, presumably to help solve a mental health epidemic introduced by predatory services and poor parenting.
As all of this happens, incumbent service operators - even extremely ethical ones - will still be subject to regulations conceived with the "big fish" in mind. Even for regulations that specifically apply to giant corporations, there will always be the issue that they were designed around the problems of the 2020s. Fortunately, when AI takes over, it will be able to rewrite those in a timely fashion, colored by the training performed by... some powerful, international entity.
I am as torn by this new global wave of child-protection regulation as I was two days ago, but it was only yesterday that I realized it was going to affect me much sooner than I anticipated. The biggest change so far is that I can no longer mock the British folks on my Discord server about their draconian internet laws, for now us Portuguese folks have some too.
It's difficult to present arguments against these regulations without sounding like a monster, and yet I highly doubt the ends justify the means; it feels like tackling second-order effects rather than root causes. These types of approaches certainly present a very slippery slope.
Now if you'll allow me, I'm going to play some Counter-Strike while logging into Steam still requires no callbacks through any government API.