As a longtime NeXTSTEP user, I still remember the first time I perused the filesystem of Mac OS X 10.0 (Cheetah) in 2001. And I distinctly remember thinking to myself that Apple basically took NeXTSTEP and slapped their own window manager and app framework on it only. The filesystem and toolset were nearly identical, and I even found several config files (I don't remember which ones) that still had NeXTSTEP listed in the comments.
As I went on to develop iOS apps in the 2010s, NSObject (NS=NeXTSTEP) continued to be a reminder of this same lineage.
> Apple basically took NeXTSTEP and slapped their own window manager and app framework on it only
Yup, that’s precisely it, and Apple made no secret of this. As early as the mid 90s, Apple knew their OS was a dead end and they needed to start over with a new OS as the base. Their in-house effort (Copland) was becoming an obvious failure and it was apparent they needed outside help. At the time a lot of people thought BeOS was going to be it, but the deal fell through when it became apparent that having Steve come back was a priority, and buying NeXT came with him. But it was always gonna be the case that the OS was going to be totally replaced and only surface level UI conventions would be maintained from classic macOS.
NS stood for NeXT and Sun. NX was the original prefix before OpenSTEP.
Uh, that's interesting! Do you have a source for this (or is it firsthand knowledge?)
A tangent I know, but looking at those old screenshots really made me miss that era of OS X. The first versions of Aqua with pinstripes were a bit busy for my liking, but by the Mountain Lion time frame it was just lovely. Actual buttons! Soft gradients! Icons that had colour!
I am still very sad that the point we started getting high-DPI displays everywhere was about the same time we decided to throw away rich icons and detail in UIs in favour of abstract line art and white-on-white windows.
Maybe it was on purpose? Those fancy textures and icons are probably a lot more expensive to produce when they have to look good with 4x the pixels.
iOS 4 on an iPhone 4 and OS X whatever-it-was that was on the initial retina MacBook Pros are still very clear in my memory. Everything looked so good it made you want to use the device just for the hell of it.
Long live Snow Leopard! It made my mac fly. A whole release dedicated to making Leopard better. It was amazing, peak macOS.
100% agree; if I could revive it to run it on modern arm hardware I would in a heartbeat.
I run an iMac G4 with 10.5 as a home music player. The strange thing is that it feels so easy to use. All the ingredients are the same in modern macOS but the feel is very different.
It’s hard to say why. Clarity in the UI is a big one (placement and interaction, not the theme, ie what we’d call UX today). But the look of the UI (colour, depth) really adds something too. Seeing a blue gel button sparks a sense of joy.
For me, seeing old OSes always reminds me of the bad stuff. Slow CPUs, slow networking, slow disks, limited functionality.
Maybe I'm a bit too negative but for example when people romanticise stuff from the middle ages I can't help but think of how it must have smelled.
If only there were theming available to recreate that old formatting and those styles.
Copland, the failed OS that NeXT was acquired to replace, had themes.
https://lowendmac.com/2005/apples-copland-project
You also had Kaleidoscope[0]. That had some crazy themes[1].
[0] https://www.macintoshrepository.org/1706-kaleidoscope
[1] https://web.archive.org/web/20191021204432/https://twitter.c...
It's amazing that, still today, you find NSStrings and other NS-prefixed stuff all over working code.
It's actually hard not to know anything about the old Appkit, as much as Apple would have you believe that it's all SwiftUI now.
Apple is pretty clear in its intention of making SwiftUI the blessed UI toolkit. However, they haven't deprecated AppKit or UIKit in any way and keep updating them, as they demonstrate at every WWDC with "what's new in AppKit" (e.g. for 2024: https://youtube.com/watch?v=McKYDogUICg).
They also provide means to mix-and-match AppKit and SwiftUI in both ways. In no way are they trying to "have you believe it's all SwiftUI now". It is simply the next generation of UI frameworks.
I feel that SwiftUI still has a ways to go. I like the ideas and philosophy, but the execution is still a work in progress.
First, they really need to beef up the docs.
Next, they need to stop punishing people for "leaving the lane." Not everyone wants their app to look and behave like a bundled Apple app.
I dislike Autolayout, and UIKit definitely has a lot of "old school" flavor, but with IB and UIKit, I can make an app that can do just about anything that I want.
> First, they really need to beef up the docs.
This is a common refrain I hear w.r.t. Apple, and while I write very little native mobile code, I have to agree. The documentation is so sparse, and rarely has more than what I could find out just by hovering over a function/variable in my IDE.
Would it kill them to add a paragraph explaining why or how you'd use the thing? To add code samples? Must be too much for one of the most valuable companies in the world… What kills me is that this is actively hurting their own platforms. I can understand some of Apple's moves, but this one is so incredibly short-sighted.
I love autolayout. Learning how to lay stuff out has never felt so easy. Interface Builder felt so confusing and arbitrary, like having to learn an entirely new language just to get basic behavior working correctly. Plus, it didn't compile to code but to a "nib" file you had to work with in abstruse and unintuitive ways. At least I can debug code; how the hell can I debug a nib? Most of the exposed functionality was difficult to google and assumed you knew what all the obscure icons (like springs and arrows and lines) meant. Very confusing and frustrating.
Meanwhile autolayout was very intuitive. Define your variables, define your constraints, and it just magically works.
Also the influence of WebObjects has been unappreciated.
EOF was probably the first ORM and Direct To WS the first web-based no-code tool.
Absolutely. WO was a brilliantly designed framework (especially for the time) and, being somewhat disillusioned with the state of web development in the last decade, I'm still using it as the UI layer for some of my own applications. It just can't be beat when it comes to throwing together a quick app, essentially being AppKit for the web. And as you say, its influence was great, although I often wish it had a little more influence.
EOF was a great ORM framework as well and I never really understood ORM hate - until I had to use ORM frameworks other than EOF which generally feel … not that great. I ditched EOF a decade back though, due to it being, well, dead, and replaced it with Cayenne which is an excellent, actively developed ORM that feels very much inspired by EOF's design principles.
In the last few years, I've been working on a WO inspired framework (to the point of almost being a WO clone on the component/templating side) as a side project. It's still very raw when seen from the outside, no documentation and still operating under a bad codename - but hoping to make a release and port my remaining WO apps in the coming year. Hopefully it will add at least a bit to WO's influence on the web development world :).
https://github.com/ngobjects/ng-objects
https://www.youtube.com/watch?v=-obvt93wSFc
Especially hilarious when you think of the rising popularity of HTMX.
WebObjects' at-the-time revolutionary model of using the URL for state management would work really well with the new trend back towards server-side rendered components.
Totally. I've been very happy to see the world embrace htmx in the last year and it's given me confidence knowing I'm doing the right thing with ng-objects.
The methodology htmx uses is in many ways identical to what we've been doing in the WO world for almost 20 years using Ajax.framework (which I don't know if you're familiar with), a WO plugin framework that most importantly adds "partial page updates". So you can wrap a part of a page/component in a container element, and target it so only that element gets rendered/replaced on the client side when an action is invoked (link clicked, form submitted etc.).
And yes, combined with WO's stateful server-side rendering and URLs, it's ridiculously powerful. I usually design my WO apps so users never actually see a stateful URL; they always land on "static URLs" while stateful intra-page work happens through page replacements. I love it.
It is basically a whole generation rediscovering what we were doing in the 2000s, now that the SPA craziness has gone too far.
It also influenced the design of Distributed Objects Everywhere at Sun with OpenStep, which eventually got rewritten into what became Java EE.
Anyone familiar with Java EE will find a familiar home in WebObjects, especially the Java rewrite.
I remember the Unix-ness was a big part of OS X’s nerd popularity. People were talking about real Unix terminals, for example.
Later, Windows also aimed for the same thing with its new console app and Linux support. Yet macOS has remained the same. The Terminal app feels essentially unchanged, and there's no good app package service (e.g. brew etc. - these are third party and can mess up your system).
Even Xcode is, well… look how extensions were restricted.
Modern macOS feels boring, but also not aimed at developers.
> The Terminal app feels essentially unchanged
Is this supposed to be a bad thing?! It's a rock-solid workhorse. If they changed it I would stop trusting macOS to be any better than the linux flavor of the month
Back in the early 2000s it was a top choice if you wanted some kind of unixy box with a polished GUI desktop that "just worked", especially if you wanted a laptop. BSD and Linux were fine, but as desktop OSes they were a very different experience from today; they took way more tinkering even on a desktop PC, as anyone who had to write their own X11 config will tell you. Today, installing a Linux desktop distro is so easy and hardware compatibility is so good that the tables have turned. Also, if you are the type of user who wants a big DE (no judgement), the Linux DEs today are far more polished; people still complain, but if you go back in time it was a mess. These days macOS seems extremely restrictive and awkward by comparison: a huge chunk of the userland got stuck in time, while Apple has become more and more hostile to any kind of changes and customisations to the more unixy side of the system.
Sun had an agreement with Toshiba for Solaris laptops, but they were rather pricey.
UNIX is stuck in time; hardly anything has improved beyond file systems and small API additions, and that is what macOS is measured against: POSIX certification.
Note that the only standard UNIX UI is CDE, and anything 3D isn't part of POSIX.
ZFS, BcacheFS, HammerFS... I think OpenBSD will have a better FS soon.
On modern FSes, the plan9/9front ones are pretty much ahead of almost anything; but plan9 is a Unix 2.0. It went further. On 3D, forget POSIX. GL was the de facto API, and now Vulkan, and the most common middleware multimedia API is SDL2.
Yeah, but none of that is UNIX(tm).
While IrisGL was born on IRIX, it was placed under ARB stewardship, which after the Longs Peak disaster became Khronos.
Vulkan only exists thanks to AMD offering Mantle to Khronos - an API originally designed for game consoles, very much not UNIX - and had it not been for AMD, Khronos would still be wondering what OpenGL vNext was supposed to look like.
SDL also has very little to do with UNIX history, as it was created originally to port games from Windows to Mac OS (not OS X) and BeOS.
> I remember the Unix-ness was a big part of OS X’s nerd popularity.
* https://www.gocomics.com/foxtrot/2002/02/25
> there’s no good app package service
It's called the App Store.
Do developers use the app store? 99% of what I install on my computer isn't available through the app store. I just use it for Apple apps (Pages etc). Pretty much everything else is freely available and more fully featured outside the app store.
Plus, it's spammed with low-quality for-profit crapware—the iOSification of an otherwise fantastic platform
Not exactly...
The App Store installs what you would otherwise install through a .dmg or .pkg. That is, if you install, for example, Android Studio, Docker and UTM, you will have three QEMU executables, one for each app.
Homebrew does quite a good job as a package manager for the Mac; however, it's far from how package managers work in Linux distros. For example, by running ``sudo pacman -Syu`` I upgrade everything that is installed, including the kernel, standard libraries, Python packages, language packages, manpages and so on. On a Mac, I have to upgrade the system through system updates, Homebrew packages through ``brew upgrade``, Python packages through pip, the App Store installed stuff through the App Store, and the manually installed apps through whatever way they happen to be upgraded.
When's the last time you used the App Store to install a CLI utility or a shared library?
I don’t think you can call the Mac App Store “good”.
I remember seeing the Finder running on NeXT at a Halloween party at the Omni Group in 1999. That was a cool experience.
> Along with analysis and debugging tools, Apple still gives away everything needed to build apps for the Mac, iPhone, or iPad.
Very conveniently glossing over the fact that developers still have to pay an annual Apple Developer Program subscription fee in order to be able to distribute their apps. TANSTAAFL, as always.
Not just distribute, even to run them locally on your own devices for longer than a few days.
Very conveniently glossing over the fact that if you are developing for the Mac, no, you don't. You can distribute it outside the store without paying anything.
iOS, yep you're right.
If you choose not to pay Apple for the privilege of macOS development, you will need to teach users increasingly more arcane tricks to get the app running. As of the latest macOS release, the old trick of "right click -> open" stopped working, and the new trick is "open -> go to system settings and click on a magic button -> open again".
You don't pay Apple for the privilege of development, you pay them for the privilege of guaranteeing your users you are a legit developer who cares about their safety by registering and letting your app be reviewed.
Considering it would take less than a day for Apple's registration scheme to be overrun with billions of fake app builders if they didn't put in a small monetary roadblock, I don't see how this situation could be improved.
This has little bearing on desktop software, which usually doesn't go through the App Store. Apple does not (yet?) require review for traditionally distributed desktop app bundles or executable binaries. The developer fee is paid in that case just to get a signing certificate. The increasing number of hoops necessary to get unsigned things to run seems to just be funneling more developers into paying up and becoming beholden to Apple so they can stop the nagging of their users.
Until the mid-2010s, most apps were unverified and people trusted the distribution channels where they got them from.
iOS can sideload. Is that not allowed in the development license?
Around 2010, I started learning Objective-C to be part of the whole native mobile development movement. What I didn’t know when getting into this was how much of a history lesson I would have to participate in to understand the background behind so many aspects of the language and the core frameworks.
I miss that era!
It surprised me that Steve Jobs would be so open to unix.
I thought that, with his not-invented-here syndrome, his desire to control everything, and his attraction to simplicity and graphical UIs, he would have hated unix.
How did he come to love unix enough to build NeXTSTEP on it?
Steve Jobs was very open about taking things from elsewhere and refining them for consumption.
Lisa and Mac were products of his seeing the Smalltalk GUI at his visit to PARC. There was nothing off-the-shelf, so they had to be built from scratch.
Of NeXT he said that he had been so bamboozled by the GUI at his PARC visit that he missed the other two, arguably more important, concepts: OO and networking.
NeXT used as many off-the-shelf components as possible: Ethernet + TCP/IP for the network, Unix for the OS, Adobe's Display PostScript for graphics, Stepstone's Objective-C for the OO parts (which in turn mashed together C and Smalltalk). It bundled TeX, Sybase SQL Server, a bunch of scripting languages, Webster's dictionary, etc.
They only built themselves what they absolutely had to in order to get the machine and user experience they wanted.
> Steve Jobs was very open about taking things from elsewhere and refining them for consumption.
See also: forking KHTML into WebKit to build Safari when MS cancelled Internet Explorer for macOS and the platform was left without a robust browser choice. It fits for two reasons: they were somewhat comfortable letting MSIE reign for so long rather than making an in-house option, and when they finally did, they didn't start over from scratch.
It’s funny that Apple originally wanted Mozilla (proto-Firefox) but couldn’t figure out how to build it on Mac OS X in a reasonable amount of time.
And WebKit eventually birthed Chromium. Truly the circle of life.
He wasn't; his position regarding UNIX beards was well known.
Supporting UNIX was a business opportunity to go against Sun and other graphical workstations.
There are recordings of NeXT meetings, and his famous appearance at USENIX, regarding this.
Note that everything that matters on NeXTSTEP is based on Objective-C and Framework Kits, with zero POSIX beyond what was needed for those government and graphics workstation contracts.
Maybe he got influenced by Pixar guys: https://www.youtube.com/watch?v=iQKm7ifJpVE
Even though IRIX had its quirks.
I'm not sure the timeline adds up for that - maybe NeXT came before he bought Pixar?
Steve Jobs left Apple and founded NeXT in late 1985 with the intent of developing a 3M computer: 1 MB of memory, 1 million pixels and 1 million instructions per second; or powerful enough to run wet lab simulations.
Jobs bought Pixar in 1986, when they were developing their own computer systems. Luxo Jr. was shown at SIGGRAPH that same year, one part advertisement for their computer and one part fun hobby project, because some of the Pixar guys aspired to one day do a fully computer-animated full-length feature film of their own. This worked out very, very well for them eventually, but they also stopped developing the Pixar Image Computer in 1990, in part because Jobs was losing a lot of money propping up both NeXT and Pixar.
Development of NeXTSTEP began in 1986 under Avie Tevanian, based upon the Mach kernel he had co-developed at Carnegie Mellon, which was developed with the intention of replacing the kernel in BSD (which at this point, I believe, was still just BSD and years away from fragmentation). NeXTSTEP 0.8 was previewed in October 1988 and all the core pieces were there: the Mach kernel, BSD, DriverKit, AppKit, FoundationKit, the Objective-C runtime, and the NeXTSTEP GUI. 1.0 came in 1989.
IRIX 3.0 was released in 1987, debuting the 4Sight window manager, which isn't too similar to what was released in NeXTSTEP but does use NeWS and IRIS GL; it was, however, based on System V UNIX. It wasn't until Pixar started making movies, I think actually starting with Toy Story, that they bought Silicon Graphics workstations. For Toy Story, the render farm also started off using SGI but eventually moved to Sun computers.
So if anything, IRIX and NeXTSTEP are probably a decent example of convergent evolution given they were both (at least initially) in the business of making high end graphical workstations and neither needed to reinvent the wheel for their target market.
SGI use within Lucasfilm (and thus Pixar) goes way back to the IRIS 1000/2000 era, so definitely '83/'84 afaik.
Sure, but given the timeline, it’s unlikely the decision came about simply because he was influenced by “the Pixar guys”. I pointed out that the goal for the first NeXT computers was to be able to do wet lab simulations, and this was due to a conversation Jobs had with Paul Berg while Jobs was still at Apple. They met again after Jobs founded NeXT before drawing up the initial spec in September 1985.
More likely the decision to use Mach/BSD was because Avie Tevanian was the project lead for the operating system.
4Sight also didn't debut until IRIX 3.0 (1987, also when it picked up the IRIX name). Prior to that they used mex, which I've traced back as far as 1985; before that I'm not sure, but I don't think they had a window manager, and it seems unlikely they would have prior to 1985.
yeah, makes sense.
It's a far-fetched idea anyway. It's a five-month difference: NeXT in Sep '85, and Pixar in Feb '86.
The more likely scenario is that they wanted to come to market as fast as possible with limited resources, so porting the Mach kernel and BSD (both proven/robust things) to their platform was probably the fastest route; it also gave them an existing base of developers to attract and carried some weight if they targeted the workstation market.
edit: this is what made me think he might have been influenced, since Steve Jobs did actually launch another "cube" two years before the NeXTcube, developed before he bought Pixar. This thing required an SGI/Sun to be attached: https://en.wikipedia.org/wiki/Pixar_Image_Computer
> I'm not sure the timeline adds up for that - maybe NeXT came before he bought Pixar?
Jobs became majority stakeholder of Pixar in 1986. NeXT was incorporated in 1988.
* https://en.wikipedia.org/wiki/Pixar#Independent_company_(198...
* https://en.wikipedia.org/wiki/NeXT_Computer
But Unix workstations were a thing even before then: 68k-based systems were already around in the 1980s, with Sun (taking just one example) releasing their first product in 1982:
* https://en.wikipedia.org/wiki/Sun-1
IRIX-based systems on MIPS were big in the 1990s (post-Jurassic Park), but SGI also started with 68k-based systems.
They launched NeXTcube in 1988, but they incorporated in sep 1985.
I mean, Mach 2 was cutting-edge and freely available from CMU. Probably less a love of UNIX and more the necessity of having a practical base for a new workstation OS.