I haven't talked about my goal for personal computing in a while.

With Sundog nerdsniping me into attempting to turn the LibSSH-ESP32 port into the basis of a full-fledged SSH client for the ESP32, I guess I should spend a few minutes talking about why I bother with this bullshit.

Computers could be good, but they aren't.

That's the gist of it.

I guess I mean Good with a capital G, as in "a force for good in the world", but I also mean good with a lowercase g, as in "not super shitty to use, or think about".

I'm not going to waste a lot of bits talking about how computers are bad. I've done this a lot before, and you probably already agree with me. I'll quickly summarize the high points.

What's wrong with (modern) computing?

- Computers spy on us all the time
- Computers are insecure, while pretending not to be.
- Computers enable new modes of rent seeking, further exacerbated by shitty patents and worse laws
- Computers/the modern internet encourage behaviors which are bad for our mental health as individuals.
- Computers and the modern internet, in concert with modern capitalism, have built a world essentially without public spaces.

You know, all that bullshit.

As I said, it's a summation. There's nuance. There are more problems.

That list should serve as an okay shorthand for the kind of thing I'm talking about.

Computers? They're bad.

But I'm here, talking to you, through a computer. I derive my living from computers. I spend most of my free time in front of a computer.

In spite of all the ways computers are lowercase b bad, computers enable a lot of Good.

I believe in the potential of computers, in our digital future.

I've spent a lot of time thinking about what the next 30 years in computing might look like, the successes and failures of the last 30 years, and the inflection point at which a computer is Good Enough for most tasks.

I've spent a lot of time thinking about the concept of planned obsolescence as it applies to computing, and what modern computing might look like without the profit motive fucking everything up.

@ajroach42 Sounds like you'd enjoy World Wide Waste by Gerry McGovern!


I'm just a dude.

I'm a sysadmin. I spend a lot of time using computers, and specifically I spend a lot of time fixing machines that are failing in some way.

But I'm just some dude who thinks about stuff and imagines futures which are less horrible than present.

I've said that as a way to say: I don't claim to have The Answer, I just have some ideas. I'm going to talk about those ideas.

Sidebar over.

So how did we get from the gleaming promise of the digital age as imagined in the 70s to the harsh cyberpunk reality of the 20s?

Centralization, rent seeking, planned obsolescence, surveillance, advertising, and copyright.

How do we move forward?

Re-decentralization, a rejection of the profit motive, building for the future/to be repaired, building for privacy, rejecting advertising, and embracing Free software.

Personally, there's another facet to all of this:

I use and maintain computers professionally, and recreationally.

Sometimes, I want to do something that doesn't feel or look like my day job. Unfortunately, most of my hobbies feel and look a lot like my day job.

To that end, I have some weird old computers that I keep around because they're useful and also because they're Vastly Different than the computers I use at work.

My , Mac Plus, and Palm Pilots fall into this camp.

I can do about 80% of what I want to use a computer for with my 486-based, non-backlit, black and white HP Omnibook.

Add my newly refurbished and upgraded Palm LifeDrive, and I'm closer to 95%.

Let's run through those tasks:

- Listen to music (The Palm is basically a 4GB CF card with a 2GB SD card.)
- Watch movies (I have to encode them specially for the Palm, but the LifeDrive makes a great video iPod.)
- Read books (plucker is still pretty great)
- RSS (ditto above)


- Email (via some old DOS software, the name of which I'll have to look up, a lot of effort getting my mail server configured correctly, and an ESP32-based modem. This took some doing and I still don't love how I'm doing it. I'll do a blog post about it eventually.)
- Social (mastodon via brutaldon via lynx via telnet over tor to an onion endpoint I run in my home, not ideal, or via BBS)
- Write (via Windows 3.1 notepad)

- Consult reference material (via the internet or gopher over my esp32 modem with the appropriate DOS software and SSL proxy, or more likely, via a real hacky thing I wrote to mirror a bunch of websites to a local web server.)
- Develop (frankly, via GW-BASIC, although I'd love to start doing real programming again.)
- Games (this is the thing the omnibook is worst at! I consider that a strength most of the time, but I do have a lot of parser based IF games on it.)

There was a time in the recent past when I used a Pentium MMX laptop as my only computer outside of work for weeks at a time.

It could do basically everything I wanted it to do, including some far more impressive games.

Its batteries gave out, finally, but until then it made a decent little computer.

The only real problems I run into with these setups are the hoops I have to jump through because I'm the only one using them, and because (wireless) networking technology has advanced in ways that are not backwards compatible at the hardware level, while leaving laptops without a clear upgrade path.


@ajroach42 I used to think we'd naturally get to Star Trek: The Next Generation as our tech brought us towards post-scarcity. But it takes humans to do it.

This feels like a kind of rambling sidebar, but there's a point:

Most tasks that computers are used for on a daily basis could be completed on much less powerful hardware if there wasn't a profit incentive in the way.

So, circling back to the original point: I'm imagining a world in which computers are different.

Specifically, different in that they are designed to be cheap, easily repaired or replaced, and to just Do Their Job forever.

(This requires defining the job they are supposed to do.)

No one gets upset that their typewriter can't browse the internet, you know?

But a computer isn't an appliance, it's an everything machine, and as an Everything machine, if it can't do the New Shiny Thing we have to throw it away and get a new one.

That's the mentality I'm trying to break out of.

I want to define a(n extendable!) list of tasks that I feel like will be relevant indefinitely, and build a machine to last 30 years.

Which, finally, brings us back to the ESP32!

See thread:

Basically, the ESP32 is a simple microcontroller (that is to say, it's a computer! It's just not a computer the way we usually think about it.)

It's really cheap, like $3 or $4 for a simple board. There are folks making software for it already to treat it like a desktop computer.

It's not completely open or completely standardized or capable of everything I want out of my but ...

They get most of the way there on every count, and they have built in wifi and are so very cheap.

It would be entirely possible to base a new paradigm of multi-decade computers on the ESP32, but built in such a way as to be agnostic to the actual hardware. That is to say: either you follow the write-once-run-anywhere model of Java and use the ESP32 as the host of a virtualized environment, OR you build for the ESP32 and then emulate the ESP32 on newer hardware in the future.

This is basically the approach that Infocom took in the 80s when they developed text adventure games for every computer on the planet.

They invented a fake computer, compiled all their games for that fake computer, and then wrote emulators for that fake computer for every major machine of the era.

As a result, basically any computer can run an Infocom game.
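The fake-computer trick is easy to demonstrate. Here's a minimal sketch in Python of a hypothetical toy machine (not the actual Z-machine): define a tiny instruction set once, and any host that can run the interpreter can run every program ever compiled for it.

```python
# Opcodes for an imaginary stack machine. Programs compiled to these
# opcodes never change; only the interpreter gets ported to new hosts.
PUSH, ADD, PRINT, HALT = 0, 1, 2, 3

def run(program):
    """Interpret a flat list of bytecode for the fake computer."""
    stack, pc, out = [], 0, []
    while True:
        op = program[pc]
        if op == PUSH:       # PUSH <value>: put a literal on the stack
            stack.append(program[pc + 1])
            pc += 2
        elif op == ADD:      # ADD: pop two values, push their sum
            stack.append(stack.pop() + stack.pop())
            pc += 1
        elif op == PRINT:    # PRINT: pop a value and emit it
            out.append(stack.pop())
            pc += 1
        elif op == HALT:
            return out

# "Compiled once": this program runs identically on any host with run()
program = [PUSH, 2, PUSH, 3, ADD, PRINT, HALT]
print(run(program))  # [5]
```

Porting the platform to a new machine then means rewriting `run()` (a few dozen lines), not every program.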

@ajroach42 This code has been getting published to GitHub btw, @enkiv2 likes linking to them!

Now, is the ESP32 a good home for a multi-decade computer?

I dunno!

It's a little more limited than I would have picked (I'd have probably stopped in the early multimedia era), but it's also way less power hungry than what I would have picked, and frankly significantly cheaper and easier to understand.

So I'm going to spend the next few months exploring the ESP32 as the basis for a purpose built computer that inherits the legacy of the tinkerers of the microcomputer era.

@ajroach42 So what you're saying is that the problem with modern computing is that computers are behaving more like people. Make no mistake, people are the problem.

Long and a bit rambly, sorry

@ajroach42 I have a powerful gaming desktop, a slightly less powerful gaming laptop, a good ish 2012 Thinkpad, and a bunch of Raspberry Pi's and 99% of what I do can be done on the Pi's (watching Internet things, and especially with the 4GB models that's more than enough to have 1 stream plus a few chats, plus Fedi, and probably still play Doom). There's only a few games I can't play on my Thinkpad, which has no GPU (and comes from the first gen of intel CPUs where they thought maybe we should make the GPU slightly more powerful than enough to display Win7 Aero)

You can do everything, slowly, on a Raspberry Pi, and you've been able to do it since the first Pi came out. 8 years later, and 8x the power per Pi, they're still considered "low power" and yet when I got my first RPi in 2012 the gaming rig I *dreamed* of having had 8GB RAM in it... Chillblast custom PCs could spec a max of 32GB at the time!

If we somehow convince enough people that something like a Pi is enough, then companies will *have* to bring their usage down.

Schools were meant to adopt the things (at least in the UK) but it never really happened because Microshaft has their Office suite, and programming languages so deeply ingrained it'd be like pulling the floor out from under the ICT curriculum...

@alcinnz @enkiv2 It has, yeah. It's not Free Software though. Just something that was archived.

Principles I plan to adhere to in my ESP32 exploration:

- Offline first, but networked
- Understandable is better than "easy to use"
- Don't make assumptions about available hardware
- Don't re-invent the wheel without a good reason
- Don't try to build a modern computer
- Decide what this machine should do, and make sure it's good at those things.

Further thoughts:

Chip-8: a virtual machine from the 70s for making games and software portable. It's part of the reason your graphing calculator plays games.

The 100 year computer project that sent me careening back down this path has a lot in common with Chip-8 (and the article mentions it by name.)

I have other thoughts but it's dinner time. I'll revisit it.

@ajroach42 Espressif also made the ESP32-C based on RISC-V (the ESP based on the Xtensa LX6 is now named ESP-S by Espressif). So this is a more long-term evolution. They currently give away RISC-V based boards if you mail them and have good projects (otherwise it's about 2~3€ per board :)

@popolon That is Interesting!

@ajroach42 Between your threads and my own rambling about low power devices and project ideas, I keep coming back to wishing for an as-open-as-it-can-be, simple, somewhat modern PDA-type device based on an ESP32 (or, ideally, a similar fully open microcontroller) with some added features that PDAs didn't have. Mainly thinking about inputs and outputs that PDAs lacked.

@ajroach42 This is neat but I feel like you got one thing wrong about your points. People don't build bloated software for profit (not always).

The reason is that they want something that makes development quick for them. Most of the tools that do that add bloat.

Modern programming has a focus on easiness for the programmer, not for the hardware.

Companies do bloat things for tracking/obsolescence though of course.

Good luck on your journey with this!

@kelbot And I'm just trying to replace my Omnibook or my Tandy 102.

(ultimately, these goals largely overlap.)

@ajroach42 Going to dig through some of my github stars and NC bookmarks a little later for some stuff I've saved that is relevant to this.

@ajroach42 Right! If something like this gets going I could see multiple slightly different form factors with a common base becoming pretty easy to accomplish.

@ajroach42 I've been reading through your goal of a multi-decade computer and it echoes a lot of thoughts I've had. I'm building a house this spring and want to put in a whole-home system that will hopefully be as future-proof as reasonably possible. It is a hard goal. My gut tells me that you're a little on the primitive end with your use cases, but not by much. I think things got good enough around 2010 on the hardware/software end. Any reason you would shy away from the Raspberry Pi?

@daniel the end of the Windows XP era is a good target in terms of features, depending on your software and your use case.

I have some computers that age that get regular use.

I use and enjoy various Raspberry Pis in some special purpose builds. Outside of that, I find them to be, basically, just normal computers, but slightly underpowered for some tasks. In my experience this leads to folks trying to use them like normal computers and then getting frustrated.

@daniel if the pi meets your needs, and you're confident you'll be able to keep it running in the future, I don't see any reason not to use it.

@ajroach42 The reason I strongly consider them is mostly the community and their prevalence. I think I'll be able to find one in 50 years.

We've talked a little about hardware. We've talked a little about use cases, but we should probably dig deeper into that.

The remaining piece of this puzzle is software, which I think is closely tied to, but ultimately separate from, use cases.

I'll talk about that now, a bit, until I fall asleep.

So first things first, it's late and this might be incoherent.

Support for the Xtensa core is being merged into LLVM, so your language and compiler options are going to be nice, and ESP32s are going into lightbulbs, so there's plenty of potential for salvage

I've been working with zig as a front end for C to try setting up a cross-compiler environment. I've always hated builds, so this project is especially easy to procrastinate on, but if one of us gets that working then it's pretty easy for someone to build that code for RISC-V or another LLVM-supported architecture later

In order for a decade spanning computer to be remotely useful, it needs software that speaks common protocols and file formats.

These protocols and file formats should be open standards, well defined, and well documented.

In my current use cases, I mostly use plaintext files, CSV, and HTML. When I need more specialized software, I convert between an old file type and a new file type using a piece of open source software on a more modern system.

This takes the form of, for example, antiword or pandoc, running on Linux.

Ultimately, there are still these least-common-multiple kinds of standards out in the world, and they're likely to stick around forever (can you imagine a world without plaintext files?), and converters to get back to these platform-agnostic file types are likely to stick around too.
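The converter idea is simple enough to sketch. Here's a hypothetical example using only the Python standard library, reducing HTML (one of those least-common-multiple formats) down to plaintext; real workflows would more likely reach for pandoc or antiword.

```python
# A sketch of a "least common multiple" converter: strip HTML down to
# plaintext using only the standard library, so the result is readable
# on any machine that can display a text file.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the text content of an HTML document, ignoring markup."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        # Keep only non-empty runs of text between tags
        if data.strip():
            self.chunks.append(data.strip())

def html_to_text(html):
    extractor = TextExtractor()
    extractor.feed(html)
    return "\n".join(extractor.chunks)

page = "<html><body><h1>Notes</h1><p>Plaintext outlives platforms.</p></body></html>"
print(html_to_text(page))
```

The point isn't this particular script; it's that as long as the open format is documented, a converter like this can be rewritten from scratch on whatever platform survives.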

But filesystems, transfer protocols, etc? These things change, and often with good reason. Our system will need to keep up.


"Most tasks that computers are used for on a daily basis could be completed on much less powerful hardware if there wasn't a profit incentive in the way....

I want to define a(n extendable!) list of tasks that I feel like will be relevant indefinitely, and build a machine to last 30 years."

I mean, to me this only leads one place: getting emacs to run on the absolute most minimal hardware :P

If you are making do on limited hardware you are mainly doing text editing at that point.

My solution to this problem so far has been intermediary computers.

One example: Fetch the emails or the RSS feeds on my laptop, convert them, shuttle them over serial to the old machine.
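A minimal sketch of that conversion step, assuming Python on the modern machine and an inline feed instead of a live HTTP fetch (a real setup would download the feed and push the result through a serial port, e.g. with pyserial):

```python
# A sketch of the "intermediary computer" workflow: the modern machine
# parses an RSS feed and flattens it to plaintext that the old machine
# can display after a serial transfer. The feed is inline here; a real
# setup would fetch it over HTTP first.
import xml.etree.ElementTree as ET

RSS = """<rss version="2.0"><channel><title>Example Feed</title>
<item><title>First post</title><description>Hello, old computer.</description></item>
<item><title>Second post</title><description>Still here.</description></item>
</channel></rss>"""

def feed_to_plaintext(rss_xml):
    """Flatten an RSS document into plaintext the old machine can read."""
    channel = ET.fromstring(rss_xml).find("channel")
    lines = [channel.findtext("title"), "=" * 20]
    for item in channel.findall("item"):
        lines.append(item.findtext("title"))
        lines.append("  " + item.findtext("description"))
    return "\n".join(lines)

# On the modern box: print (or write to a file, then shuttle over serial)
print(feed_to_plaintext(RSS))
```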

Another: use the older machine as a serial terminal to a more modern box. Do my actual work on the more modern box.

This extends the life of the older machines, and lets me access modern conveniences when I want them, but it doesn't actually provide a model for a computer that will remain relevant.

I dunno what the answer is here, but I suspect it's something like

1 - define a native format for networked data that you're willing to support.

2 - provide several common network interfaces including serial/uart and wifi.

3 - be willing to whack another machine on to the serial port and let it translate a new hardware or software protocol when the existing ones are no longer supported. (Such as what I do now with the Omnibook and my ESP32 wifi modem.)

And 4 - be willing to adapt.

The point of this project, as I see it, is to provide a standard set of tools and protocols and file formats that should be relevant and workable for decades.

If 2.4GHz and 5GHz wifi stop being supported by new radios in ten years, or WPA2 is replaced with WPA3 or whatever, we write new firmware, install a new radio, or migrate all our data (stored in open formats) and software (open source and largely platform agnostic) somewhere else.

Treat it like a faulty part. Replace it. It's the Ship of Theseus. It's my grandfather's axe. The thing remains the thing because it works the same way and performs the same functions.

On the other hand, until such time as there is reason to do otherwise, and for as long as is reasonable after there is reason to do otherwise, support the standards, and fight to keep them backwards compatible.

@ajroach42 I think one significant goalpost was reached when peripherals got better than human senses could discern. Another was the maturity of software RAID systems. Yet another was the plateauing of audio/video codecs. One of the biggest hurdles for me is the reliance on smartphones to capture recordings. They are too convenient and have amazing quality. Getting the iPhone to talk to Linux is always a PITA. If Linux smartphones take off, I have a hard time figuring out what else I need.

@daniel I'm not terribly worried about smartphone lockdown because I've stayed out of Apple's ecosystem in order to avoid it.

I think you made a valid point with the fact that monitors, for example, can show more detail than eyes can see.

I don't know that I consider that to be super relevant, but I'm also content watching films on DVD, so what do I know?

@daniel that's fair. They've stuck around for a while.

@yaaps I don't know zig, but am otherwise pleased with the contents of this post.


Emacs is already over 30 years old. It will be around as long as computers are around, and coupled with org mode it can do anything that involves text within a single holistic workflow.

If a computer was designed around emacs/org mode, you certainly wouldn't try to use it like a shiny new computer (and fall into that expectation trap that would keep you from truly interfacing with this computer as a new experience).

Anyway, here's a pamphlet about the one true god that is emacs

@Alonealastalovedalongthe most of what I do, and most of what most people do, with a computer is editing text and then sending that text to places.

Databases, spreadsheets, web pages, emails, IMs, all text at heart.

I imagine emacs has already been ported to the ESP32, but I haven't verified that.

I'm considering a slightly different approach, one that aims to be more prescriptive, but the emacs life is valid.

Last bit for tonight: peripherals.

PS/2 is fine. USB-to-PS/2 adapters are cheap and easy and can be made by hand. You could even wire up a couple of USB ports that just convert straight to PS/2.

I don't love the idea of trying to support multipurpose USB 1 or 2, much less USB 3 or C, so I won't.

Printers are good and important. They're also pretty complicated. Serial printers exist, and lots of printers that don't speak serial at least use PostScript, so I'm confident we'll sort out printing.

I guess one other thing to talk about is operating systems.

The thing is, I don't care.

As long as the OS itself isn't a hindrance to what I want, I don't care what it is.

Several folks are running CP/M on this hardware. I dunno why you'd want to do that or why you'd standardize around it, but I'm a DR fan and I like CP/M, so let's fuckin go.

@ajroach42 on the protocol front, I imagine could fill that component.

Thanks for this thread, last year I started really thinking about what I need from computers too.

While zig is a new C-like language with some interesting features, what you'd be most interested in would be its utility as an alternative front end for Clang:

Besides the features in that blog, there's some nice things about the way zig manages libc for the cross compiling environment and tracks changes in your code to reduce unnecessary recompiling

But I'm not trying to open up rabbit holes under your feet while you're just crossing the field. I've got to crack open a couple laptops to shuffle SSDs around to get a working dev laptop again

@ajroach42 Computers per se, no. But the operating systems most people use do. Windows and Mac spy on you and sell advertising and personal information.

Simply don't use these advertising and lock-in operating systems.

@ajroach42 @EdS hey @stevelord you got namechecked in this thread of things you’ll totally agree with.

@stevelord @Alonealastalovedalongthe @ajroach42 wow!! I had no idea there was a LISP on CP/M!!

@ajroach42 what kind of viable business models can companies adopt so they'll resist the urge to use built-in obsolescence?
(Well other than proper management instead of oligarchic fat cat management)

@ajroach42 Thinking about a virtual computer running on all kinds of hardware, and the probable reality of folks wanting to play games, Pico-8 is, I think, interesting. Not Open or Free software, but an example of low-power computation providing excellent experiences. Dev environment built in. Really enjoying this thread. Thanks for sharing.

@comrad "simply"? That's bold phrasing, that I find patronizing. Tread carefully.

It's not just Windows and Mac. It's also Ubuntu, Android, and nearly every major commercial website. The only way to escape surveillance is to radically redefine how you interact with a computer. Nothing simple about it.

@ajroach42 Android is Google, what do you expect?

Also, just use a Google-free Android version. And Ubuntu does not necessarily track you, but besides that there are tons of other distributions besides stupid Ubuntu.

@stevelord @grimmware @EdS thanks. I'm using this space to organize my thoughts. I think I still have a few more rabbit holes to traverse (operating systems in more detail, printers and other peripherals in more detail, supporting networks and cooperation, community, etc.) And I want to spill a few more bytes on ease of use versus ease of mastery vs comprehensibility.

Then I'll probably package up a blog post and try to roadmap out my next project.

@jos TIC-80 and Chip-8 live in similar spaces, but are open. TIC-80 now, in response to Pico-8. Chip-8 in the before times.

The ESP32 can run various other emulators, and all of this dovetails nicely with a thing I'll talk about later today with reference to how I used to interface with computers.

@comrad there we are again with that "just" as if it is actually simple.

It's a massive amount of work for the average user to break free from corporate surveillance, and there's not a viable web browser option that's trustworthy and secure.

... You know what? No. I'm not going to keep going with this conversation. Enjoy feeling superior.

@jos @ajroach42 When there’s a lot of lua calls inside a single frame it can require a little more juice than expected. Sometimes what can look simple can be expensive under the hood

@vesperto decentralized, worker owned, cooperatively managed, much smaller scale manufacturing coupled with some changes in consumer habits are the only path forward I see.

@ajroach42 Alright, I have a question. Internationalization support, and support for assistive technologies. Do these fit in? I see a lot of these kinds of posts about making simpler computers, but they always seem to take large steps back in accessibility for anyone who isn't an English speaker, and they tend to be ableist.

@ajroach42 there is nothing superior about the simple desire to be free. It doesn't have to be all or nothing, but the start is not the technique but the desire.

One can assume that there has to be a desire in a person, otherwise no external change will work out. And with that desire, the technology or whatever else needs to change can be overcome.



This is a valuable callout, and a thing that is frequently overlooked in these projects.

I'm not an expert in internationalization, and I'm building this system for myself first, but I'm not naive or selfish enough to ignore these issues entirely.

When we're dealing with text, internationalization means UTF-8 support. When we're in the UI, it means language packs. Those are modern inventions that should be simple to carry over, even if I can't do the translation work myself.

@errantlibrarian Accessibility is another world. I don't know what this world is like right now.

The best I can do is find whatever the existing best practices are, and attempt to meet those. Screenreader support seems like the most obvious and easy to implement, but I'm sure there are things I don't know about.

I said yesterday "No one gets upset when their typewriter can't connect to the internet" or something like that.

It was off the cuff. I hadn't thought about it much.

I'm thinking about it now.

No one gets upset that their typewriter can't connect to the internet because that typewriter could never connect to the internet.

That is a central component to this idea.

@ajroach42 Yep -- I like this idea, and this effort you're going for.

Networking seems to be the problem; the computers themselves, like you say, still do the same thing they were designed to do.

If a business was keeping track of sales, money in/money out with a computer from 25 years ago, they could technically do the same thing today.

You'd just need an extra "human interface" to enter the data from one system to another, replacing a networking protocol. :p

It's frustrating when a 10 year old computer can't get on Facebook or watch 480p on YouTube anymore, even though it could five years ago.

The computer hasn't changed.

Facebook and YouTube have become more complicated. They didn't need to, but they could get more complicated because the average computer got faster and the average internet connection got faster over that time span.

So a computer that could do X lost its ability to do X as a result of a third party.

That's frustrating.

So a portion of the multi-decade computer platform would need to be multi-decade support from network services.

When I was a web developer, we used to call this idea graceful degradation, or progressive enhancement.

The idea is: don't assume JS will be there, but feel free to use it if it is. Specifically, this means Don't Break if CSS or JS or images are missing.

We abandoned that idea in web development, and frankly most web developers never embraced it to begin with.

And even when web developers embraced it, your browser probably fetched JS you didn't execute. Your browser probably fetched images you didn't view. Your browser probably loaded CSS files, even if you didn't interpret them.

@ajroach42 a depressingly large number of people think that computers (mobile devices included) "wear out" with age and use due to this phenomenon. The fact is that in the past 20 years or so that the reason the same computer is slower at doing the same tasks now compared to then is almost entirely due to surveillance and targeted marketing practices.

Even if you're not mining bitcoin literally most of the energy expended by most computers is driven by this BS.

But if we can embrace decentralization, and specifically embrace more peer-to-peer relationships and open standards for communication, we can ensure that every multi-decade computer can communicate with every other multi-decade computer.

The multi-decade web?

The peer-to-peer web.

A web independent of the internet.


@mdm Sneakernet. UUCP

@trechnex I use lynx and mobile sites often.

But .... I mean, it's bad.

And a lot of these have started disappearing. Twitter's is gone, for example.

@ajroach42 You're right! Even better.

@ajroach42 I think you've not mentioned the one motivation behind all that JS and images that companies like that have: tracking users. Most of the JS that's loaded on web pages is for tracking people. That also goes for images that get loaded but which users don't see. JS especially has enabled this sort of digital surveillance.

@ND3JR It's a long thread, I mention that in the middle, but I don't hammer on it hard enough.

@ajroach42 DR-DOS was so much better than MS-DOS...

@mathew @ajroach42 I remember DR DOS. I think I had DR DOS 5 and 6. I remember it had one of those disk compressors built in, and it had a primitive task switcher for DOS. It was like ... 8~9 MB though; huge back then. My dad deleted it for the space

@djsumdog @ajroach42
Not many people know that CP/M had multi-user support long before any Microsoft OS did.

@ajroach42 How are you getting Tor on DOS? Or have you somehow built that functionality into the WiFi modem?

@DHeadshot I'm running a wifi hotspot that's also a torproxy.

@ajroach42 free software is not enough. we need libre silicon and integrated circuits.

@ajroach42 it's designed by and for capitalists. That's it, that's the whole explanation for all of the above issues.

@zatnosk @ajroach42 -shakes fist at the profit motive-

@ajroach42 This sounds somewhat in line with the DAT protocol.

@ajroach42 Excited to hear about Tic80, downloaded it and am playing with it now!

@Sandra @ajroach42 Sure, something to keep in mind. :) I wonder if there are other emulators that could be run in that hardware and be performant, like a DS? Although then you would also need the tool chain to run in it too...

@jos @ajroach42 Yes, DS was actually where peeps were running into the limits:

@ajroach42 I have a 2006 32-bit machine (Dell Latitude) that has 4GB RAM. In 2017, it worked great, and I could do everything I needed and still had plenty of RAM untouched (no swap used!)

I left it home to embark on a long assignment out of town, taking another one with me.

When I came back a little over 8 months later, I could not use it for a browsing session without hitting swap anymore. And everything else was the same.

The web - slowly but surely - has turned to shit.

@kzimmermann Powerful anecdote, and pretty common.

I use a Thinkpad x220 as my daily driver right now. I've used this machine or a variant on it for *years.*

In the last 8 months, it has gotten progressively more difficult to use this machine in the same way I had been previously, to the point that I've given up on it.

Not on the machine, on the things that I used to do.

@ajroach42 I feel you.

Since I have moved around since then and left my 2006 machine at a friend's place, I have not had the chance since 2018 to try to use it again, and I'm quite curious to fire it up again.

Yet, I'm afraid that when I finally do so, it will also be my farewell bid to it.

And even with old-machine-oriented distros like Puppy Linux the usability factor has greatly deteriorated, since so many things moved to the browser (and data silos). That is a really sad reality.

@ajroach42 that's why we need to stop fucking using electron aaaaa

@ajroach42 just make some quick notes on your points here, and you make some good ones
-they spy because somebody wrote the code to do that, and made the decision to implement it.
-because of the spying
-exacerbated by the inability of the average, or even exceptional, person to manufacture computer chips at home, which enables continuous deprecation of designs to force movement towards the current most profitable products
-what was that old USENET guideline that got lost in the Eternal September? “Treat it like the library and don’t be an asshole?” Could’ve sworn I heard that from somebody. Even worse is that participating in those behaviors has become the expected and demanded norm, thus perpetuating the problem and enabling huge amounts of backlash against attempts to change it.
-an internet made up of private spaces that people think are public. A city of glass houses, where people think theirs is opaque.
A point you made later on, but I want to throw in here: things slowing down. Because programmers let their programs become resource hogs. Always pushing the edge of the hardware envelope to show the boss and the customers "look, here is how polished this is! As long as this is the only program you have open". Which, to be fair, has been the de facto paradigm for most programming (outside of speciality projects) since the beginning.

@theruran @ajroach42
So the closest we get to libre hardware with ARM (ignoring how ARM makes money and who's buying it) is FairPhone and Librem, which apparently have some binary blobs but they're disabled?

Then there's a RISC-V system scheduled for release next year. Totally libre hardware and it runs... Which Linux? ChromiumOS?

So maybe we set our sights a little lower, not that ESP-32 is problem free...

(Not all undocumented features could be this pleasant)

@yaaps @ajroach42 I did a class project this semester on this system:

I don't think everything there is libre hardware, but it's about as close as is feasible today. They go to great lengths to employ evidence-based security and also discuss some of its limitations.

@ajroach42 This is one of the major obstacles to a multi-decadal computer I see. Both on the connectors (hardware) and the protocols. I wrote up some thoughts on


At least character sets are (mostly) backwards compatible (but I may have worked on EBCDIC systems).

@yaaps @theruran @ajroach42 That's cool...

@drwho @ajroach42 Great write-up!

I was just thinking... I think it's been mentioned before: to move away from silicon because Big Tech has a death grip on it. There may be cleaner and safer processes with other semiconductor materials? Thought I saw experimentation with graphite, too.

@ajroach42 Ha, awesome, will read tonight!

@Sandra @ajroach42 Oh that is different from what I was thinking, but also cool -- I was thinking about running a DS emulator on this CPU Andrew is talking about, not running Pico8/Tic80 on a DS itself -- but that *would* be cool!

@ajroach42 I'm pretty sure USB and PS/2 aren't compatible. Passive PS/2 controllers only work because later PS/2 devices also contained USB implementations, and auto-detected which protocol was in-use on power-on. (Happy to be corrected, because I want the world to work that way.)

@theruran @ajroach42 Thanks! It's a little dated, I haven't updated it in about six years, but I tried to give the general principles legs for the long haul.

There is some experimentation with graphite - graphenes in general, really. Experiments in using carbon nanotubes as transistors are still ongoing. Personally, I think rod logic might be a better application for them, but at best I'm a somewhat well-read laybeing in that particular field.

@wizzwizz4 USB and PS/2 are not necessarily compatible, yeah.

Most USB devices will operate when connected to a PS/2 adapter/port but it's not a requirement of the standard.
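The dual-mode auto-detection described above can be sketched roughly like this. To be clear, this is an illustrative guess, not real keyboard firmware: the function name, the line-state parameters, and the decision rule are all my own assumptions, based on the general fact that a PS/2 host idles its clock and data lines high through pull-up resistors, while a USB host holds D+/D- low (via pull-downs) until a device attaches.

```c
#include <stdbool.h>

/* Hypothetical sketch of how a dual-mode (USB + PS/2) input device
 * might pick a protocol at power-on. Real firmware would sample the
 * electrical state of its two signal pins; here those states are
 * simply passed in as booleans.
 *
 * Through a passive PS/2 adapter, the device's D+/D- pins end up
 * wired to the PS/2 clock/data lines, which the host pulls high.
 * A USB host instead keeps both data lines low until attach. */

typedef enum { PROTO_USB, PROTO_PS2 } proto_t;

proto_t detect_protocol(bool line_a_high, bool line_b_high)
{
    /* Both lines idling high: looks like PS/2 pull-ups. */
    if (line_a_high && line_b_high)
        return PROTO_PS2;

    /* Otherwise assume a USB host holding the lines low. */
    return PROTO_USB;
}
```

A real device would presumably sample the lines for a few milliseconds after power-on before committing, which is consistent with the "auto-detected which protocol was in-use on power-on" behavior described above.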

@ajroach42 This thread stirs up so many "yes!" thoughts in me, where to start?

I make a habit of periodically working from a . A real VT510, says "digital" on the front and everything. It is connected to a Raspberry Pi on a cart, and from there I ssh to my "real" workstation (though the newer Pi 4s are getting to be nearly capable of that on their own). One of the biggest hurdles I face is UTF-8, but it is a fantastic focus mode and a bit of a step back in time.

@ajroach42 One problem with is that its security model is, shall we say, from an earlier era. Neither strong cryptography nor strong authentication is built in, and both are a must these days. However, is a modern replacement that might be good:

Also I have been re-reading some of @joeyh's blog posts about offgrid living. Can't help but love modern always-on Internet, also can't help but think "I'll go online when the sun says I will" has a deeper purpose too.

@ajroach42 This whole thread resonated with me in the way I think about computers, being (pre-covid anyway) surrounded by computers that were 30, 40, 50, even 60 years old and still doing exactly what they were supposed to do, usually with greater speed and efficiency than anything made in the last two decades. I come from a world where if a computer crashes, instead of cursing the software, you'd check for rust or the gnawings of mice.

@msh @ajroach42 @solderpunk I recently left a couple comments on a Hacker News thread about exactly this. We have mind-bogglingly powerful CPUs that are being used to interpret 5MB of JS to send a message or read a chunk of text (news article, recipe, whatever). Such a total waste.

@ajroach42 I think I walked the same path but ended up in a different place - I want to make a wooden-cased laptop with a 19" 3:4 screen and a mechanical DVORAK keyboard, and power it off an 18v drill battery. I figured that the case and screen holders would be slightly oversized, to accommodate incorporating different scavenged panels in the future. Actual computing would use a Pi or whatever little single-board computer was popular in the future, with room inside the case for extras.

@ajroach42 Basically I want to build a laptop that can My Granddad's Axe its way into 2100. The inside would have a thick panel on the bottom into which could be screwed PCB standoffs and you'd just whack whatever's handy in there. I guess I want a laptop built like a JAMMA cab, endlessly having its guts recycled and its bits reconfigured until only the shape of the thing hints at what it used to be.

@ajroach42 Most of all, really, I just want to make a laptop that's designed for the user. It'll be wood because wood is beautiful and people like beautiful things, and the only people who want computers to be plastic are the companies that can injection-mould a million cases exactly the same. It'll be big because there'll be a big battery and space inside to change it, the only people who want computers to be thin are the people who have to find the room in warehouses and lorries and ships.

@ajroach42 It'll have a high-res 3:4 screen because everything but gaming and spreadsheets is better with a high-res portrait-oriented monitor, the only people who want a meager 1080 lines are the people who want to make TV panels and monitor panels in the same factories. I think it's a sadly unusual idea that a machine could be made exclusively for the benefit of the person who'll use it, rather than having it be designed for ease of manufacture, marketing, transportation and obsolescence.


A *few* minutes :-)

@ajroach42 honestly kinda tempted to take one of my open source projects/services and offer two ways for schools to exchange value for it. 1) paid subscription and 2) forming a club of students interested in helping maintain it.

@jos @ajroach42

Thanks for letting us know about tic80, works great on my pink 2DS, playing Cauliflower Power right now♥


Load MX or antiX as live media from a flash drive and it will be able to use Facebook or YouTube again...

Without permanently installing the new operating system...

It's interesting looking at this from computers; I suppose this is more representative of the larger problems that stem from neoliberalism, which encourages making profits from everything. Vandana Shiva talks about the patenting of seeds by way of GMOs. Seeds are the best example of open-source-ness, and trying to monetize and close-source seeds is leading to horrible effects for crops and farmers. But I bring this up because she notes that the close-sourceness of the web has led

@ajroach42 Yes, that’s pretty annoying.

I would not mind if the web required (free) software updates. But unlike VP9 and AV1 as codecs, H.265 requires much, much more resources to decode than H.264, so you lose video support on old computers when the platforms update to it.

@ajroach42 In retrospect, one of the most regrettable parts of the "PC revolution" is how much wheel-reinvention happened due to disconnects between large systems people and micro/PC people.

E.g. the high watermark of hardware-enforced security in operating systems is arguably still Multics, developed 1964-67. Just one example.

So much knowledge from the development of large multiuser systems was ignored or not known to PC developers, and we now live with the consequences.

@ajroach42 I remember when QVC was a big computer reseller and they were always pimping the latest and greatest hardware to their customers, who were basically all retired old people, telling them they needed the latest PENTIUM (or whatever it was at the time) to check their AOL email and read the latest news on the AOL AARP forum.

So much wasted power for people doing very little.

@ajroach42 on the plus side, you don't use Facebook with such a machine.

@ajroach42 I bet you could run uxn on an ESP32.

@msh @ajroach42 And that's why I have to love iOS devices. They don't get slower with age. Just gifted an iPhone 6 (with a small crack in the screen) to an older relative for free, to replace the buggy pile-of-crap Android device they were using.

Their Android phone was so buggy and slow they could barely send text messages -- the iPhone 6 is as fast as the day it was made. They're apparently loving it.

@mdm @msh @ajroach42 Interesting, I have an iPhone 6s with a broken screen that I thought about switching to from an Android device. I'm tempted to replace the screen myself and see if it's usable in 2021.

@poindexter @msh @ajroach42 An iPhone 6S? It definitely is. It's even still on Apple's "supported" list and is getting regular iOS updates. (That's six years of updates, so far.)

I can't think of an Android phone that got more than 2. Maybe 3, at most.

@mdm @poindexter @ajroach42 Apple is closer to acceptable software support than any Android vendor has ever been. They still get failing grades for actively making their hardware less repairable and more closed than could ever be justified.

Unless and until a vendor addresses both, I am done with both Android and iOS and have made my my daily driver. Perhaps if becomes more accessible to me I'll try Android again.

Google recently pushed further, with 5 years of monthly updates for the Pixel 6 & 6 Pro (including all phone component firmware - which Fairphone can only do for a couple of years on the FP4)

Hopefully other devices will follow (Samsung went from 3 to 4 years of updates on top end devices recently)

The Pixel 6 & 6 pro are powerful & expensive. Will be interesting to see if Pixel 6a arrives & pricing

Use my Pixel 3XL as
@mdm @poindexter @ajroach42

@dazinism @msh @poindexter @ajroach42 I don't trust any of these companies to actually follow through with these promises -- I make my opinions of them based upon their actual past behavior, which, for updates, hasn't been great.

I don't believe Apple is supporting phones for longer than everyone else out of the goodness of their hearts (it's possibly just because their hardware is similar between models), but the fact remains that nobody else even comes close.

@mdm @dazinism @msh @poindexter @ajroach42 The 2015 Fairphone 2 is still getting security updates.

For years Google has consistently delivered on the timelines they guaranteed for device updates.
Generally they do a month or two longer than guaranteed.
In one case (a tablet a few years back) they did six months extra

There are numerous options for keeping devices going longer. You don't get component firmware updates (many laptops & PCs don't get more than 2 years of these) after Google drops support, but the kernel & operating system still can be.
My favourite for older … 1/2

@msh @poindexter @ajroach42

…devices is unlike others, they do verified boot where possible

@msh @poindexter @ajroach42

@dazinism I'm glad to see things moving in the right direction, though I can't help but feel the vendors are being dragged kicking and screaming due to overwhelming consumer demand and threat of government regulation rather than wanting to respect their customers.

Considering the maturity of the industry now, 5 years is really a bare minimum and 10 years would be a good ultimate goal. I just retired a daily driver notebook that was 9 years old, so why not? A bit absent from this discussion is repairability, too.

@mdm @poindexter @ajroach42

My point was that laptops (and netbooks) aren't really any better. Like your 10 year old netbook, there are still folks who are using the 10 year old Samsung Galaxy S2 as their main phone

It's likely neither the S2 nor your netbook has been getting proper updates to component firmware or drivers for most of their lives

A real issue is the myriad of different (cheap?) devices. Who's going to spend the time properly maintaining the code for all these different devices?
@mdm @poindexter @ajroach42

@dazinism @msh @mdm @poindexter I think it's important not to throw "cheap" under the bus when talking about problems of support.

Maybe "cheap" devices do get updates for a reasonable amount of time, and many expensive devices don't. (Juicero et al.)

Cheap is good, because it enables adoption, which can encourage open source support.

Locked firmwares and DRM are at least partially to blame for the situation we are in now, and that happens in every market segment.


Very much this. I think the connection between cheap and supported is pretty weak and that the most premium devices are vulnerable to loss of support due to closed hardware and firmware.

The challenges in supporting consumer devices are often created rather than inherent. If they were designed to be more openly accessible, then community support would be able to better fill the gaps left by vendors in the long term.

@dazinism @mdm @poindexter

@ajroach42 did you happen to see John Carmack (of Doom/Quake fame) talk about his success getting Oculus to unlock the Go headset during sunsetting?

he has some wonderful ways of talking about these sorts of concerns - about a desire for someone to, 10 years from now, 20, notice a dusty headset in a closet, pick it up, & be able to have maybe not a current experience, but be able to still update the OS, get apps & games on it, try it out, see what it's good for.