So it seems, between the new AMD CPUs, Intel CPUs, and Nvidia GPUs, that everyone is pushing power consumption and heat as hard as they possibly can to get the most performance out of their chips.

I wonder if this is due to competition or if it's more to do with the end of Moore's Law where we're hitting the limit of current transistor tech.


Of course architectural changes can help, like how Apple went for ARM and how AMD is moving to a chiplet GPU design like they did for CPUs, but yeah... it's not looking good.

Also, just about everything is ridiculously expensive this generation. DDR5 RAM, motherboards, CPUs, GPUs, and new PSUs are all going up in price.

So consider this a reminder that the new shiny isn't necessarily better, and that "old" tech does the job just fine for most of what we need!



an interesting suggestion by Clive Feather about implementing BCPL at the bottom of this article:

instead of having to scale addresses every time they're used (which was a pain on the byte-addressed machines where BCPL was popular, but is no biggie on machines like ARM¹ and IA32+, which support base plus scaled index addressing modes), implement BCPL to just ignore the bottom n bits of every word, where 2ⁿ is the number of bytes in a word

...of course BCPL was born into a world where the byte hadn't yet caught on anywhere but IBM's mainframe division, though; everything was word addressed in 1967

¹ except the Cortex-M0, because fuck microcontroller coders

Quick thread of random questions. Don't think too hard, just answer.

Would you rather have a prehensile tail, or be able to see clearly in the dark?

While we're all here, I also have a bunch of (mostly) ISA cards, Sound Blasters, and some graphics cards, all of which are free (excepting the AWE 64 Gold).

Anyone want a NEC PC-9821 Ae with Japanese keyboard, serial mice, and Intel Overdrive CPU? Price is free or at your discretion, so long as you make good use of it and are willing to cover shipping.

@millihertz In addition to Thinking Forth, the book On Lisp by Paul Graham concentrates a lot on bottom-up design, which is characteristic of that language as well. The book Let Over Lambda, though I have not read it yet, seems to have a chapter dedicated to exploring the similarities in design between the two languages. I'm more familiar with Forth than lisp, but I find the existence of these similarities interesting. They seem to come from the ease of changing the way the language works (through the ease of creating words and new data structures in Forth, and the use of macros that pass functions to functions in lisp). These methods can be employed in other languages, and work well when they are, but are not generally used.

so this article strikes a chord...

but it feels like the author has got the diagnosis right while landing on a prescription that's 180° wrong:

The entry barrier to programming needs to be high!

no, absolutely not! and if the author looks hard, this is obvious; the entry barrier to programming was higher back then only in terms of obtaining access. in every other sense, it was lower; even the largest computers were so much simpler that one person could grok them almost overnight, operating systems were laughably simple compared with today's, textual interfaces are almost laughably easy to code compared with modern GUIs, network security was just not something anyone cared about...

in contrast, there's a revealing quote from a reply on Hacker News:

At a minimum you need node, npm, webpack, babel, an spa framework, a frontend router, a css transpiler, a css framework, a test runner, a testing functions library, and a bunch of smaller things, and that's just what is "needed" to build a static website with a bit of interaction.

that's a lot of complexity to get on top of! it means that the entry barrier to programming is actually much higher than it was - to the point where people just don't have room in their heads for the high level stuff of web programming and a model of what the computers are actually doing with that stuff.

and yes, a static website with a couple of CGI pages is simpler - as long as you're OK with making every security mistake of the last three decades all over again. because network security is hard, and that's part of what drives the adoption of frameworks.


the other part is kind of historical now - it's the mess that browsers were in between about 1995 and 2010, where nothing worked properly from one browser to another and the most widely used browser was hideously broken. a lot of frameworks evolved to hide the complexity of working out what CSS to present to which browser to get a uniform result.

in a way this complexity still exists - it's just, now it's the gulf between finger-steered mobile and mouse-driven desktop browsers.

(somewhere along the way that changed into forcing browsers to present things as though they were spreads in a Saturday magazine, which doesn't exactly help matters - but blame management and their insistence on hiring graphics designers for that. ;-)

the article continues...

Programming is engineering; it's not something where you throw stuff at the wall and see what sticks

trouble is, it's not just engineering. there's a lot of art involved too. that's where things get problematic...


if it were just engineering, then simple solutions would obviously be the best. you don't want an innovative bridge that prioritises form first! and that's the thing: the barrier to designing a bridge is not high. people have been doing it for millennia! the barrier to designing a bridge that won't collapse, ever, is high, but only in the sense of "you have to demonstrate an understanding of why innovative bridges are a bad idea before we let you".

but of course, everyone knows how to use a bridge. it doesn't require thinking about. (well, the "three ropes across a ravine" style of bridge does, i guess - and that's where 1970s/early-80s computing was, frankly.)

but coding isn't about that. coding is the head-on collision of engineering, architecture and anything-goes graphic design - and the designers are leading things, which is probably the natural consequence of what the web has evolved into. and that, i suspect, is why things have become so complex.


hmm... i think this might have ended up in a different place from where i started.

the point i was initially intending to make is that the barrier to entry into computing has become unusably high and needs to be reduced again, by fundamentally rethinking how we can achieve the desirable aims without the colossal amounts of cruft that we've seen evolve around delivering them.

but i think where i've ended up is that the barrier of entry has actually become unscalable for people of an engineering mindset, because it's not for us any more. the web people couldn't / wouldn't wait for us to work out how to build proper, thousand-year viaducts - so they built what they needed on a foundation of three-ropes-across-a-ravine, and have been adding stuff to shore that up ever since. and they can't back out now, because they've been working on this stuff for three decades and there's too much of it now for them to do anything but keep piling on.


i think there's room in the marketplace for a few more books on bottom-up app design than just Thinking Forth. the philosophy that Chuck Moore hints at therein:

I asked Moore how he would go about developing a particular application, a game for children. As the child presses the digits on the numeric keypad, from zero to nine, that same number of large boxes would appear on the screen.

Moore [replied] "I don’t start at the top and work down. Given that exact problem, I would write a word that draws a box. I’d start at the bottom, and I’d end up with a word called GO, which monitored the keyboard."

necessarily, bottom-up programming starts with poking around your environment, getting a feel for both it and the problem and how the two will coalesce. which means defining a lot of words that might not go anywhere, until you figure out the particular shape of the problem you're solving - at which point things (should!) come together with striking rapidity.

Finally been able to do another bit of artwork over the last couple of days

Also filmed this to go up on YouTube, and quite literally everything that could go wrong did: corrupted files, music not working, etc. But I got there in the end.

CW animal eye contact

#MastoArt #artist #DigitalArt #cat

i can't live without my scalable antialiased fonts - but i am curious as to how much more memory and speed they take over the alternative

obviously, they can be implemented so that the answer is "not much at all" *points at RISC OS* but is that the case on modern machines - specifically modern machines using X11, which can no longer use the builtin X font engine (and take advantage of a shared glyph cache, for example) to do it?

also, how much more weight is imposed by the OpenType format than by RISC OS' own format?

the mastodon instance at is retiring 

for a lot of reasons, both personal and practical, i don't think i can effectively run a public mastodon instance any more, and i need to retire it. i want to do this responsibly and make sure everyone has a chance to migrate and export properly, so here is the plan for the instance retirement (feel free to share this link around):

due to ongoing moderation load issues and understaffing, between now and 1 november, the instance may be undermoderated. for other instances, if this poses an issue for you, i encourage you to use federation tools at your discretion. i would only ask that you be clear and public about your decisions so that users can figure out where they will be able to migrate their accounts.

thank you

At their very simplest, anarchist beliefs turn on two elementary assumptions. The first is that human beings are, under ordinary circumstances, about as reasonable and decent as they are allowed to be, and can organize themselves and their communities without needing to be told how. The second is that power corrupts.
-- David Graeber

#anarchism #quote #bot


On the internet, everyone knows you're a cat — and that's totally okay.