I made a Mac OS System 7 web desktop UI with real web browsing: https://win9-5.com/macos/
A re-imagining.
It's familiar and alien at the same time, like I'm seeing an alternate universe.
I made my own web-based Mac simulator some time ago: https://www.metamage.com/apps/maxim/
It was a way to become more familiar with CSS and JS (and indulge my classic Mac OS nostalgia), but my biggest takeaway was that the web wasn't a foundation I wanted to build complex structures on. That realization indirectly spurred me to create Advanced Mac Substitute.
Will definitely give it a try, since I would _love_ to have a Classic Mac environment with some modern creature comforts (like file sharing) on tiny machines.
It only takes one unintentional reliance on an implementation detail to make an application not run on another OS implementation...
A 1-bit framebuffer and a CPU get you most of what the machine can do.
Most of the quirk abuse of 8-bit machines came from features that were provided with limitations. Sprites, but only 8 of them; colours, but only 2 in any 8x8 cell; multicolour, but only in one of two palettes, and you'll hate both.
Almost all of the hacks were to get around the limitations of the features.
I don't know if the decision Apple made was specifically with future machines in mind. It certainly would have been a headache to make new machines 5 generations down the track if the first one had had player/missile graphics.
So even things that wrote directly to the framebuffer would ask the OS for the address and bounds rather than hardcode them, copy protection would be implemented using license keys (crypto/hashes, not dongles) rather than weird track layouts on floppies, etc. It led to good enough forward compatibility that the substantial architectural changes in the Macintosh II were possible, and things just improved from there.
Much more common in my experience was the assumption that the framebuffer was 1-bit, but such games would still run on my IIci if I switched to black & white—they’d just use the upper left 3/4 of the screen since they still paid proper attention to the bytes-per-row in its GrafPort.
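The well-behaved pattern described above can be sketched in plain Python (hypothetical names loosely mirroring QuickDraw's BitMap fields — `baseAddr`, `rowBytes` — not real Toolbox calls):

```python
# Sketch: drawing into a 1-bit framebuffer while respecting rowBytes,
# rather than assuming rowBytes == width // 8. Field names loosely
# mirror QuickDraw's BitMap (baseAddr, rowBytes); this is illustrative
# Python, not actual Toolbox code.

class BitMap:
    def __init__(self, width, height, row_bytes):
        assert row_bytes >= (width + 7) // 8
        self.width, self.height, self.row_bytes = width, height, row_bytes
        self.base_addr = bytearray(row_bytes * height)  # 1 bit per pixel

    def set_pixel(self, x, y):
        # Correct: index rows by row_bytes, as reported by the OS.
        # A game that hardcoded the 512x342 screen (row_bytes == 64)
        # would draw skewed on a larger display with row_bytes == 80.
        byte = y * self.row_bytes + (x >> 3)
        self.base_addr[byte] |= 0x80 >> (x & 7)  # MSB-first, as on the Mac

fb = BitMap(width=640, height=480, row_bytes=80)
fb.set_pixel(9, 2)
# Pixel (9, 2) lands at byte 2*80 + 1 = 161, second bit from the MSB.
```

A game that only honored `row_bytes` (but still assumed a 1-bit depth) is exactly the kind that would run correctly, if small, on a Mac II switched to black & white.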
Could be that by the time I was using a Mac II though that all the games that didn’t meet that minimum bar had already been weeded out.
There is a story in Writing Solid Code by Steve Maguire [1]: Apple asked Microsoft to fix hacks in its Mac apps that didn't conform to the developer docs in Inside Macintosh. Such workarounds had been necessary when the apps were first developed alongside the original Macintoshes, but they would be broken by a major System Software update then under development at Apple, which naturally wanted to avoid having to permanently add back the implementation bugs and quirks that the workarounds either relied on or were meant to avoid.
As Maguire told it, removing one such workaround in Microsoft Excel was hotly debated by the Excel team because it was in a hot-path 68k assembly function and rewriting the code to remove it would add 12 CPU cycles to the function runtime. The debate was eventually resolved by one developer who ran Excel's "3-hour torture test" and counted how many times the function in question was called. The total: about 76,000 times, so 12 more cycles each time would be about 910,000 cycles total... which on the Macintosh 128k's ~7 MHz 68000 CPU would be about 0.15 seconds added to a 3-hour test run. With the slowdown from removing the workaround thus proven to be utterly trivial, it was indeed removed.
[1] https://openlibrary.org/books/OL1407270M/Writing_solid_code - page 136, heading "Don't Overestimate the Cost"
It would be fun to have a "slow it down" feature that also has the various floppy read/write noises paired with it. Bonus points for different generations of hardware and having the OG HD noises to pair with those too!
It was discontinued in 2005, but the developers subsequently open sourced it and put the code on GitHub a couple years later. [2]
[1] https://en.wikipedia.org/wiki/Executor_(software)
[2] https://github.com/ctm/executor
Bonus: One of the engineers from ARDI, the startup that created Executor, was very briefly featured in Bob Cringely's 1996 documentary Triumph of the Nerds, talking about the lifestyle of working at a prototypical mid-90s Silicon Valley startup.
Decades later, though, emulation performance is mostly a non-issue (and even improves automatically with faster hosts). What matters now is portability (which requires ongoing maintenance) and renovation of programs designed around having the CPU to themselves (via dynamically applied patches).
For example, check out the MacPaint demo:
https://www.v68k.org/advanced-mac-substitute/demo/MacPaint-A...
If you were to double-click the Hello document in macOS' Finder, it would launch and open in MacPaint.app.
Many hours were wasted on that game.
and yes:
https://github.com/jjuran/metamage_1/commit/30cb0e260d5ff478...
So, rather than emulating hardware to run native ROMs, they "simply" reimplemented the ROMs.
A friend of mine did this at another level. He basically rewrote the bulk of the toolbox as a C library so that the company, who had a Mac application, could port it to run on a PC, while sharing the source code.
This was before Windows, and it worked! Launched it from DOS, takes over the entire screen. He didn't copy the Mac look and feel. Instead he used OpenLook for his gadgets and what not (since it was, you know, "open").
But he rewrote the bulk of it: QuickDraw, Event Manager, Memory Manager, Window Manager, etc. Just ate it like an elephant. I don't think his regions were as clever as the Mac. Pretty sure he just stuck with rectangles.
`TRAP` is a different instruction, with opcodes `$4E40`–`$4E4F` (`TRAP #0` through `TRAP #15`). Each one gets its own exception vector.
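The encoding distinction is mechanical and easy to check (a small Python sketch; the 68000 dispatches any opcode whose top four bits are 1010 through the line-1010 "A-line" vector, which is how Toolbox trap words like `_LoadSeg` are caught):

```python
def trap_opcode(n):
    """Encode 68000 TRAP #n (n = 0..15): opcodes $4E40-$4E4F."""
    assert 0 <= n <= 15
    return 0x4E40 | n

def is_a_line(opcode):
    """A-line ('line 1010') instructions: top four bits are 1010."""
    return (opcode & 0xF000) == 0xA000

print(hex(trap_opcode(7)))        # 0x4e47
print(is_a_line(0xA9F0))          # True  ($A9F0 is _LoadSeg)
print(is_a_line(trap_opcode(7)))  # False -- TRAP is not an A-line word
```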
It's not just trap calls, though — sometimes applications write directly to the sound buffer or use hardware page flipping.
The irony is not lost on me. :-)