Back to the future

Perfect text forever
One of the major flaws in religious texts is the claim that very old texts, describing an ancient society, will keep their meaning intact over the centuries, confirming that a godly text is perfect and remains perfect forever. Special mention for the Quran, because it is meant to be the direct declaration of Allah, providing a perfect textbook.

This could be true for very short texts, such as: Be nice to your neighbors. Don’t waste water. Don’t take advantage of the vulnerabilities of others. Wash your hands with clean water and soap before eating. Play an SF video game instead of hitting those you dislike.
If religious texts were limited to such declarations, these small statements might hold true at any point in human history. That would have been perfect.
However, they describe so much more, with the limited knowledge of the historical context of their origin. It is simply impossible to keep all of it as universal truth without cancelling centuries of accumulated knowledge.
Either believers must discard some parts of their texts, or they try to fuzz the meaning with metaphors, or they simply stick to dogmas, such as refusing to acknowledge the legitimacy of multiple sexual orientations. That is a denial of all the scientific and social knowledge and experience accumulated until now.
Let me clarify: I have nothing against believers. However, when a believer discriminates against people because of his/her/xx religious dogmas, without any serious self-reasoning and without any consideration for modern knowledge, then, in that case, we are in fully opposite positions.
This introduction lets me illustrate how hard it is, by design, to produce something that must remain valid over centuries, assuming knowledge in 1000 years will be far greater and more accurate.
Newton’s exception

Newton’s great research in mechanics was right but incomplete. That knowledge was good but not perfect. As a side note, however, it is still important today to learn physics through Newton’s laws, because in many everyday mechanical contexts those equations still solve many problems, and will continue to do so in the future.
There are still interesting updates from time to time :) have fun!
Technology as a living beast
What about technologies
Technologies are not neutral. The design describes what is permitted and what is denied, what data is requested, which API features are hidden or exposed; all of this determines an interaction. Such an interaction can favour one group of users and discriminate against others. An application can be designed to be interoperable or fully isolated.
Technologies are also described as fatalistic: once a model is built, if your case doesn’t fit the model, well, sorry... come back when you stick to the norm of the model. The lastname field is 13 characters long, but your lastname is 17 characters long? No, you cannot exist in our system. The computer system does not permit more than 13. We value your comfort. Goodbye.
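As a toy illustration of that “fatal model” problem (the 13-character limit and the error message are the essay’s own example, everything else is invented for the sketch), a rigid schema rejects real people instead of adapting the model to reality:

```python
# A toy illustration of a rigid data model: a fixed-size lastname field
# simply rejects citizens whose names don't fit.
MAX_LASTNAME = 13  # an arbitrary limit baked into the data model

def register(lastname: str) -> str:
    if len(lastname) > MAX_LASTNAME:
        # The system, not the citizen, is wrong here.
        raise ValueError("you cannot exist in our system")
    return f"welcome, {lastname}"

print(register("Smith"))             # welcome, Smith
# register("Llanfairpwllgwyngyll")   # 20 characters -> rejected
```

The fix, of course, is to design the model around reality (variable-length fields), not the other way around.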

Technologies are developing very fast: in under a century, see how many tech products have been available to store recorded music and play it individually.
Datastore
Storing data is an interesting challenge. Let’s say you are a city hall department employee. Your role is to store citizens’ data forever: data must not be lost, even partially, and must be reachable at any time. Assume the data volume will grow over time, as your city’s population increases.

Considering that data storage media change over time, making earlier tech obsolete, how can you ensure the data remains available at all times, and forever? You cannot argue that you don’t care about what will happen after your death. Your decisions, your technology choices, will have consequences after your job period ends.
Linus Torvalds, creator of the Linux project, declared: “Real men don’t use backups, they post their stuff on a public ftp server and let the rest of the world make copies.”

But your data is not the source code of the Linux kernel…
Now you start to feel the complex challenge of long-term data storage. How do you keep reachability and compatibility over time?
A first solution is to use open, libre technologies, so you can be sure it will remain legal to use the technology and that all documentation remains available.
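A minimal sketch of that approach for the city hall scenario (the record fields are invented for the example): store records in a plain, open format such as UTF-8 CSV, which any future tool can read, and keep a checksum alongside the data to detect silent corruption decades later.

```python
# Open-format long-term storage sketch: plain CSV plus a sha256 checksum,
# so the data stays readable and verifiable without proprietary software.
import csv, hashlib, io

def export_citizens(rows: list[dict]) -> tuple[bytes, str]:
    """Serialize records to CSV bytes plus a checksum for integrity checks."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "name", "born"])
    writer.writeheader()
    writer.writerows(rows)
    data = buf.getvalue().encode("utf-8")
    return data, hashlib.sha256(data).hexdigest()

data, checksum = export_citizens([{"id": "1", "name": "Ada", "born": "1815"}])
# Decades later, any tool can re-read the bytes and re-verify the checksum.
assert hashlib.sha256(data).hexdigest() == checksum
```

The point is not CSV itself but the property: both the format and the integrity mechanism are fully documented, so no vendor can make the archive unreadable.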
Software
Long-term support is also a complex issue for software, and it causes critical security issues when users keep unmaintained software over long periods of time.
The highly complex, powerful embedded computer that resides in your pocket runs millions of lines of code. All of it sits on a complex dependency chain of frameworks, libraries, utilities and configuration layers. Anywhere we write code, there may be errors that can be abused into security vulnerabilities. An error alone can be harmless, but called within a particular application stack it can open an attack surface that is realistically exploitable, or be chained with another software bug to finally build a real security issue.
As application stacks use more and more different code libraries, the risk increases.
All this results in a high cost of maintaining software for many platforms and many different releases. From the user’s perspective, however, they expect to keep using safe software. Here we arrive at a dilemma: software cannot be maintained forever for all versions, and users cannot keep unmaintained software.
In the libre-software world, providing source code can encourage others to continue the software maintenance. But this is not a strict consequence: a project can also be silently abandoned. At least the source code will be available for any required fix, and the license won’t limit code changes at any time.
The problem is not solved yet. There are numerous improvements that provide a better, safer situation, but no full solution. Consider that COBOL is still a risky maintenance challenge. What will happen in the next 10 years XD?
The fundamental theorem of software engineering (FTSE): we can solve any problem by introducing an extra level of indirection (bare metal computer -> virtual machine -> cloud computing -> application as a service).
Used with parsimony, developing a layer of abstraction can be very useful. The UNIX all-is-a-file semantics gave us interface primitives: open, close, read, write, copy, move, delete. It really helped to build a solid system architecture. From 1970 to 2025, we still focus on file descriptors, and we can still play with the elegant netcat.
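The all-is-a-file abstraction can be shown in a few lines (a sketch assuming a Unix-like system; the `/tmp/demo.txt` path is invented for the example): the same `read`/`write` primitives, applied to raw file descriptors, work on a regular file and on a socket alike.

```python
# One interface, many objects: file descriptors unify files and sockets.
import os, socket

# A regular file, accessed through its file descriptor...
fd = os.open("/tmp/demo.txt", os.O_CREAT | os.O_WRONLY | os.O_TRUNC)
os.write(fd, b"hello, fd\n")
os.close(fd)

fd = os.open("/tmp/demo.txt", os.O_RDONLY)
data = os.read(fd, 1024)
os.close(fd)
os.unlink("/tmp/demo.txt")

# ...and a pair of connected sockets expose the very same primitives.
a, b = socket.socketpair()
os.write(a.fileno(), b"hello, socket\n")
msg = os.read(b.fileno(), 1024)
a.close(); b.close()
```

This is exactly why a 1970s design decision still pays off in 2025: new kinds of objects keep slotting into the same small set of primitives.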
Energy Transformers

I explicitly say energy transformers, because we don’t generate energy (unless we are a star like the Sun), and we don’t consume energy: we only transform it. I consider that detail important, because loose wording may lead people to really think we can generate energy, as in all those stupid videos claiming free energy generation. Sorry… but the Carnot theorem still applies.
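To make the Carnot point concrete (this is the standard textbook formula, not something from the essay; the 600 K / 300 K figures are chosen for illustration): no heat engine operating between a hot reservoir at temperature T_hot and a cold one at T_cold can exceed efficiency 1 - T_cold / T_hot.

```python
# A worked example of the Carnot limit on energy transformation.
def carnot_efficiency(t_hot_kelvin: float, t_cold_kelvin: float) -> float:
    """Maximum possible efficiency of a heat engine between two reservoirs."""
    if t_cold_kelvin <= 0 or t_hot_kelvin <= t_cold_kelvin:
        raise ValueError("need 0 < T_cold < T_hot (in kelvin)")
    return 1.0 - t_cold_kelvin / t_hot_kelvin

# Steam at 600 K rejecting heat to an environment at 300 K: at most half
# of the heat can become work; the rest is transformed into waste heat,
# never destroyed, never created from nothing.
print(carnot_efficiency(600.0, 300.0))  # 0.5
```

Any “free energy” gadget claiming to beat this bound is claiming to violate thermodynamics.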
Anywhere we involve energy transformers, we involve technologies for industrial applications. Add the pressure of climate change and of critical Earth resources, and we end up in the topic of the energy transformer transition.

A transition means moving from one state to another and, in the end, having fully left the initial state. But add the human factor, and you get… well… an accumulation of transformers, not a replacement. Of course, it is great to see solar and wind transformers grow. However, due to the increase in global energy demand, we cannot stop legacy fossil transformers… yet! Evolution is very hard; we still critically rely on oil and fossil fuel.
Internet
A technology deployed all over the world, actively used 24 hours a day, all year long, is a real case of an especially complex system to maintain and upgrade.
Let’s start with a success story: the s in https
Back in the peaceful days of the Internet, securing communications was not a hot topic. Usage was mainly email (a real privacy failure) and the famous web, using the http protocol, in cleartext over the network.

But, as in every fantasy tale, that peaceful climate had to face the lack of security features. Passwords were so short and often guessable, and since everything was in cleartext, sniffing a few packets [exposed the login contents](https://www.wallofsheep.com/pages/wall-of-sheep).

This security nightmare led to several security mechanisms. Cleartext logins were prohibited, OpenSSH stood up.

Web sites started to migrate to the secure protocol version, https, but had to pay for a certificate. The EFF’s huge contribution to free, authenticated, automated https certificates led to the massive adoption we see today.

This was a major success, with general adoption, for a safer Internet.
ipv6
Upgrading a core protocol of the Internet is a really complex challenge, because of the human factor. ipv6 was developed in response to the ipv4 address space exhaustion.
In the pioneer days of the Internet, ipv4 seemed clearly sufficient, and we did not foresee the direction things would take: the dominance of mobile computing, network stacks in every kind of object.
I’ve been to numerous NOG conferences announcing the end of ipv4, and to RIPE meetings heavily encouraging the building of ipv6 networks. In the end, many backbones and core networks fully adopted ipv6. But many companies stick to ipv4, and neither budget nor plan a switch to ipv6.

ipv6 is neither less nor more secure than ipv4. It is different, and care must be taken with security features such as link-local privacy, router advertisement filtering, and neighbor discovery properties.
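The link-local privacy concern can be made concrete (a sketch of the legacy modified EUI-64 scheme from RFC 4291; the MAC address is invented for the example): deriving the interface identifier directly from the hardware MAC means the same identifier follows the host across every network it joins, which is why privacy extensions (RFC 4941 random identifiers) exist.

```python
# Why ipv6 privacy extensions exist: the legacy EUI-64 interface identifier
# is a pure function of the MAC address, hence trackable across networks.
def eui64_interface_id(mac: str) -> str:
    """Derive the modified EUI-64 interface identifier from a MAC address."""
    octets = [int(b, 16) for b in mac.split(":")]
    octets[0] ^= 0x02                             # flip the universal/local bit
    eui = octets[:3] + [0xFF, 0xFE] + octets[3:]  # insert ff:fe in the middle
    groups = [f"{(eui[i] << 8) | eui[i + 1]:x}" for i in range(0, 8, 2)]
    return ":".join(groups)

# The same MAC always yields the same identifier, whatever network you join:
print(eui64_interface_id("00:11:22:33:44:55"))  # 211:22ff:fe33:4455
```

With privacy extensions enabled, the identifier is randomized instead, breaking this cross-network correlation.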
Let’s all of us move to ipv6.
phone number
We now live in 2025 and, guess what, we still rely on the phone number scheme.

Most telephony of our epoch is voice over IP. Billions of users use WhatsApp or Signal for messaging and voice/video calls. There is no dependency left that requires the old classic phone number. Modern privacy-oriented communication applications such as SimpleX don’t rely on a phone number.
Why do we still rely on this old concept from circuit-switched networks, while the Internet is a packet-switched network? There have been ambitious projects to set up new phone directories, such as ENUM or DUNDi. A VoIP user account (SIP) follows the same scheme as an email address, so we could use that scheme. For us humans, it’s much easier to recall an email address than a string of digits.
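The ENUM directory idea mentioned above can be sketched in a few lines (based on RFC 6116; the phone number is invented for the example): an E.164 number maps onto a DNS domain under e164.arpa, where NAPTR records can then point to a SIP address.

```python
# ENUM in a nutshell: reverse the digits, dot-separate them, and append
# e164.arpa; the resulting DNS name is where the SIP mapping would live.
def enum_domain(e164_number: str) -> str:
    """Map an E.164 number like '+3312345678' to its e164.arpa domain."""
    digits = e164_number.lstrip("+").replace(" ", "")
    return ".".join(reversed(digits)) + ".e164.arpa"

print(enum_domain("+3312345678"))  # 8.7.6.5.4.3.2.1.3.3.e164.arpa
```

The elegance is that it reuses the DNS, an existing distributed directory, instead of inventing a new one; the failure was adoption, not engineering.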
Hopefully we will find an alternative to phone numbers. Who knows…
Internet fails

Security and privacy: the design of the tcp/ip protocols did not anticipate at all the security and privacy needs for serious protection of data and metadata. ipsec exists, but it is a general failure in ipv4 and a general failure in ipv6: it is only used case by case, for specific compatible configurations. There is no serious general solution. The only efficient projects build an overlay network on top of the Internet.
The Internet cannot be fixed. Even though there are serious improvements such as TLS 1.3, and future releases will address the open SNI issue, its core design cannot protect sensitive metadata.
Email cannot be fixed: its core design is privacy-weak, and multiple add-ons won’t solve the issue.
Lesson learned

Be very modular in what you build. Replacing or updating modules should not be a critical issue.
Prepare for new cipher schemes. We now mainly use ECC for PKI and AES for symmetric encryption. We are preparing for post-quantum crypto; a few serious candidates have already been highlighted, but they might change in the coming years.
IPFS multihash provides a solution to evolve with the integration of potential new hash schemes: a self-describing hash format.
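The self-describing idea is simple enough to sketch (a simplified model of the multihash format: the real specification uses varint-encoded codes, which coincide with single bytes for the small codes used here): each digest is prefixed with a code naming the hash function and with the digest length, so a stored hash remains interpretable even after new hash functions are adopted.

```python
# A minimal multihash-style sketch: <hash-code><digest-length><digest>.
import hashlib

# Subset of the real multicodec table (0x12 = sha2-256, 0x13 = sha2-512).
HASH_CODES = {0x12: "sha256", 0x13: "sha512"}

def multihash(data: bytes, code: int = 0x12) -> bytes:
    digest = hashlib.new(HASH_CODES[code], data).digest()
    return bytes([code, len(digest)]) + digest

def describe(mh: bytes) -> tuple[str, bytes]:
    """Decode a multihash without prior knowledge of which scheme was used."""
    code, length = mh[0], mh[1]
    return HASH_CODES[code], mh[2 : 2 + length]

mh = multihash(b"hello world")
print(describe(mh)[0])  # sha256
```

When sha2 eventually falls, archives of such hashes stay decodable: new readers only need a new table entry, not a data migration.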
Reticulum: a one-line code change can upgrade the Reticulum address space all the way up to 256 bits, ensuring the Reticulum address space could potentially support galactic-scale networks.
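The design choice behind that claim can be sketched generically (the names and the sha256 choice here are hypothetical, for illustration only, not Reticulum’s actual internals): when addresses are truncated hashes of an identity key, the address width lives in a single constant, so growing the address space really is a one-line change.

```python
# Hypothetical sketch: hash-derived addresses whose width is one constant.
import hashlib

ADDRESS_BITS = 128  # change this single line to 256 for a larger address space

def address_for(identity_public_key: bytes) -> bytes:
    """Derive a network address by truncating a hash of the identity key."""
    full = hashlib.sha256(identity_public_key).digest()
    return full[: ADDRESS_BITS // 8]

print(len(address_for(b"example-key")) * 8)  # 128
```

Contrast this with ipv4, where the 32-bit width is baked into every header, router and application on Earth: that is the modularity lesson of the previous sections, applied to addressing.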
As I feel concerned by climate change, resource sharing, minimalism, parsimony and Earth pollution, I’m not blindly running in the consumerism race.
I lean towards low-tech, and towards high-tech that fits a need, no less and no more. The web has gone into such bloated, ugly, surveillance-cosmetic darkness that the cookie annoyance is still there, the GDPR does not solve it, and the content, when not produced by generative AI, is less and less instructive. This is why I’m much more into a minimalist, secure, fast protocol such as Gemini.
Smaller protocols mean less attack surface, and make it easier to focus on security and privacy by design. Parsimony also teaches us to focus on goals, on the concrete needs to fulfill. Smaller also means easier to maintain over the long term.
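Gemini is a good measure of how small a protocol can be: per its specification, a request is a single absolute URL of at most 1024 bytes, terminated by CRLF, sent over TLS to port 1965. Building a valid request needs only a few lines (the example URL is illustrative):

```python
# The entire Gemini request grammar fits in one function.
def gemini_request(url: str) -> bytes:
    """Encode a Gemini request line, enforcing the spec's 1024-byte limit."""
    raw = url.encode("utf-8")
    if len(raw) > 1024:
        raise ValueError("Gemini requests are limited to 1024 bytes")
    return raw + b"\r\n"

print(gemini_request("gemini://geminiprotocol.net/"))
```

A protocol this small can be fully audited, fully reimplemented, and fully understood by one person, which is precisely the long-term maintenance property argued for above.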
Long-term also means planning for the end of the oil system, the end of electricity abundance, the end of exponential myths. Parsimony and minimalism are good companions when preparing long-term technologies.
Reticulum, a full network stack written in the popular Python language, is a recent project that aims to build for the long term, with privacy as a key concept and powerful network interfacing. Reticulum can interconnect many different kinds of physical interfaces and handle high-latency transmissions.

I don’t know the future. But I know that our resources are limited: the oil system will fail soon; the mines for all the minerals required in electronics are major actors in pollution; and the return on extraction keeps getting lower while demand increases, as more complex microelectronics require almost every element of Mendeleev’s periodic table.
Believing in cloud growth forever is just not realistic at all, while all the companies involved preach for ever bigger datacenters, ever more disks, CPUs and RAM boards.

I’m interested in preparing long-term communication networks, built for inclusive communities at large first. Network models that only focus on rich Western countries are not interesting: first because I care about the whole world, and second because who knows what the conditions in rich countries will become in the future?
I don’t care about Instagram real-time video filters, but I do care about critical text communications in off-grid areas, or in places where all industrial networks have been lost for whatever reason.
Recall the example of the city hall employee? How useful would it be to have migrated to a super high-tech solution that becomes unusable afterwards?
As a hacker, make -try -better world.