amelius 2 days ago

The problem is that vendors of hardware (read: NVIDIA) do not want the OS to have full control over the hardware. So instead they write a driver that the OS can talk to, and everything else is kept behind closed doors.

Meanwhile, companies like Apple who integrate everything can have full control, and are likely to come up with better OSes in the future; but they are even more closed, and the only talks we'll see about them are keynote speeches by the CEO.

  • grisBeik 2 days ago

    > The problem is that vendors of hardware [...] do not want the OS to have full control over the hardware

    I agree. At least the first half of the presentation blames the sordid status quo on Linux, when it is actually the responsibility of the hardware vendors. Linux not being the boot loader, Linux not being the firmware, Linux not being the secure firmware, and so on, is all the fault of the hardware vendors. They keep everything closed, even on totally mainstream architectures. On x86, whatever runs in SMM, whatever initializes the RAM chips, etc. is all highly guarded intellectual property. On the handful of boards where everything is open (Raptor Talos II?), or reverse engineered, you get LinuxBoot, Coreboot, ... Whoever owns the lowest levels of the architecture dictates everything; for example, where Linux may run.

    > Meanwhile, companies like Apple who integrate everything can have full control

    Yes. Conway's law. As long as your SoC "congeals" from parts from a bunch of vendors, your operating system (in the broad sense in which the presenter uses the term) is going to be a hodge-podge too. At best, you will have formal interfaces/specifications between components, and open source code for each component, but the whole will still lack an overarching design.

    Edited to add: systems are incredibly overcomplicated too; they're perverse. To me, they've lost all appeal. They're unapproachable. I wish I had started my professional career twenty years earlier, when C (leading up to C89) still closely matched the hardware. (But I would have had to be born twenty years earlier for that :/)

    Edit#2: the suggestion to build our own hardware is completely impractical. That only makes the barrier to entry higher. (IIRC, Linus Torvalds at one point wrote that ARM64 in Linux wasn't getting many contributions because there were simply no ARM64 workstations and laptops for interested individuals to buy and play with.)

    • js8 2 days ago

      While I largely agree, I think this is inaccurate:

      > Whoever owns the lowest levels of the architecture dictates everything

      I think in IT, the people who can create the most complexity for others, while keeping things relatively simple for themselves, are the ones who get to dictate, because they can then sell the expertise, since they "produce" it more cheaply than everyone else.

      Using HW barriers, or simply closed-sourcing things, just happens to be a quite effective way to make things complex for others and simple for yourself. Another way is to create your own language, standard, or API. Yet another is a network barrier plus data ownership (aka SaaS).

      My point is, it's possible to dictate on any level, not just the lowest.

      • grisBeik 2 days ago

        Thanks; this is a great thought! Let me try to refine it: "create irreplaceable complexity for others".

    • rjsw 2 days ago

      Another area that could be open and cooperating with the operating system is network controllers: most have an offload engine of some kind, but you can't extend what it does or fix bugs in it.
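
      For a sense of how little control the OS actually gets, here's a minimal sketch (assuming Linux; the interface name "eth0" is hypothetical) of querying one offload feature via the ethtool ioctl. Flipping documented bits like this is roughly the limit -- the engine itself stays a black box:

        #include <stdio.h>
        #include <string.h>
        #include <unistd.h>
        #include <sys/ioctl.h>
        #include <sys/socket.h>
        #include <net/if.h>
        #include <linux/ethtool.h>
        #include <linux/sockios.h>

        int main(void)
        {
            /* Ask the kernel whether TCP segmentation offload is on for
               "eth0". ETHTOOL_STSO would toggle it; that on/off switch
               is the whole extent of our control over the engine. */
            struct ethtool_value ev = { .cmd = ETHTOOL_GTSO };
            struct ifreq ifr = { 0 };
            int fd = socket(AF_INET, SOCK_DGRAM, 0);

            strncpy(ifr.ifr_name, "eth0", IFNAMSIZ - 1);
            ifr.ifr_data = (char *)&ev;

            if (fd >= 0 && ioctl(fd, SIOCETHTOOL, &ifr) == 0)
                printf("TSO is %s\n", ev.data ? "on" : "off");
            if (fd >= 0)
                close(fd);
            return 0;
        }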

  • hakfoo 2 days ago

    I'd argue the problem is that nobody enforced a strong peripheral API.

    We could have had a system like Commodore's intelligent peripherals-- a defined set of commands issued on predictable ports-- and it shouldn't technically matter how the device chooses to implement it. This lets the vendor do whatever custom special sauce they want, but it also means that any operating system that speaks the standard API will be able to support it.
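
    For instance (a purely hypothetical sketch -- the command set and ops table are invented for illustration, not taken from any real standard):

      #include <stdint.h>
      #include <stdio.h>
      #include <string.h>

      /* The entire OS<->device contract: a small, fixed command set.
         How a device implements it is its own business. */
      enum periph_cmd { CMD_IDENTIFY, CMD_READ, CMD_WRITE, CMD_STATUS };

      struct periph_ops {
          int (*submit)(enum periph_cmd cmd, void *buf, uint32_t len);
      };

      /* One vendor's special sauce, hidden behind the standard API. */
      static int toy_submit(enum periph_cmd cmd, void *buf, uint32_t len)
      {
          if (cmd == CMD_IDENTIFY && len >= 8) {
              memcpy(buf, "toydisk", 8);
              return 0;
          }
          return -1; /* command not supported */
      }

      int main(void)
      {
          struct periph_ops dev = { .submit = toy_submit };
          char id[8];

          /* Any OS that speaks the command set can drive the device. */
          if (dev.submit(CMD_IDENTIFY, id, sizeof id) == 0)
              printf("device identifies as: %s\n", id);
          return 0;
      }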

    It could even have been moved one layer up-- letting them have some shim code running on the local machine, as long as it honoured a standard API at a sub-OS level. BIOS interrupts were a good example of this: everything from MFM hard discs to modern flash drives can all be supported with option ROMs providing interoperable INT 13h support.
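
    The INT 13h extensions show how small such a contract can be: the caller fills in a fixed-layout "disk address packet" and hands it to whatever ROM answers the interrupt. A sketch of the documented EDD packet layout, with the real-mode invocation left as a comment:

      #include <stdint.h>
      #include <stdio.h>

      /* The EDD "disk address packet" handed to INT 13h, AH=42h
         (extended read). The option ROM behind the interrupt -- MFM
         controller, SCSI card, USB stack -- is invisible to the caller. */
      struct disk_address_packet {
          uint8_t  size;        /* size of this packet: 16 */
          uint8_t  reserved;    /* must be zero */
          uint16_t count;       /* number of sectors to transfer */
          uint16_t buf_offset;  /* transfer buffer, offset part */
          uint16_t buf_segment; /* transfer buffer, segment part */
          uint64_t lba;         /* starting logical block address */
      } __attribute__((packed));

      int main(void)
      {
          /* In real mode: point DS:SI at the packet, set AH=0x42 and
             DL to the drive number, then issue INT 13h. */
          printf("packet is %zu bytes\n", sizeof(struct disk_address_packet));
          return 0;
      }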

    It fell apart first when software chose to bypass the BIOS and twiddle the hardware directly, and second when BIOSes became vestigial, never really reimagined for 32-bit/multitasking use cases.

  • dooglius a day ago

    Drivers are part of the OS

linguae 2 days ago

I listened to this 2021 talk from Timothy Roscoe and found it interesting. In many ways it reminds me of Rob Pike's 2000 talk "Systems Software Research is Irrelevant," which also deplored the lack of OS research at the time. (For those of you who don't know, Rob Pike helped create the Plan 9 operating system at Bell Labs. He then moved on to Google, where he helped create the Go programming language.)

However, I wonder if the reason we see fewer OS papers describing radical departures from Unix/Linux, whether back in 2000 when Rob Pike spoke on the topic or in 2021, is that the incentive structures governing researchers' careers discourage this type of work. Writing an operating system requires a lot of effort. One could shrug this off, saying the problem is worth the effort, but many researchers face career pressures that make taking on such a task difficult.

In corporate environments, research activities must often be justified from a business standpoint, and the company's direction is often driven by short-term pressures. While Roscoe could argue that it's in a company's interest to invest in operating system infrastructure better equipped to deal with modern systems, it may be cheaper for the company, at least in the short term, to just modify Linux and call it a day.

Pre-tenure academics such as grad students, postdocs, and assistant professors have to deal with the "publish or perish" game. Perhaps a professor who already has tenure could pursue an operating system project, but even with tenure there's still the matter of getting grant money, and the grad students who would contribute are often concerned about their own research careers; they are just starting the publication game.

Maybe if we had corporate labs these days that functioned more like golden-era Bell Labs and Xerox PARC, and maybe if we had an academic environment with less pressure to publish steady results at top venues, there'd be more researchers willing to take risks and build operating systems with new designs rather than modifying Linux.

  • giantrobot 2 days ago

    > there'd be more researchers willing to take risks and build operating systems with new designs rather than modifying Linux.

    I don't know if that would be the case. While Bell Labs and Xerox PARC produced a lot of very interesting and useful research, much of it was tied up in corporate licensing for decades. The corpse of AT&T Unix haunted the industry for decades and cost many millions of dollars in lawsuits.

    Linux ate the world largely because anyone could do what they wanted with it. Modifying or building on top of Linux gets you a very long way on commodity hardware you can get at the Best Buy down the street for $200. You can spend a lot more time on your actual research target rather than having to build the whole underlying system.

    If you've got some genius idea for a process scheduler, instead of writing a whole kernel and whatever hardware drivers you need, you can just hack it into Linux. You can then distribute it easily to other researchers or testers, since it's just patches on a kernel they've already got running.
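
    That's even more true today: with sched_ext (merged in Linux 6.12), a scheduler can be a loadable BPF program. A rough sketch modelled on the upstream scx_simple example -- treat the details as approximate:

      /* fifo_sketch.bpf.c -- a global-FIFO toy scheduler as a sched_ext
         BPF program; the enqueue logic is where a "genius idea" goes. */
      #include <scx/common.bpf.h>

      char _license[] SEC("license") = "GPL";

      void BPF_STRUCT_OPS(fifo_enqueue, struct task_struct *p, u64 enq_flags)
      {
          /* Put every runnable task on the shared global dispatch queue
             with the default time slice. */
          scx_bpf_dispatch(p, SCX_DSQ_GLOBAL, SCX_SLICE_DFL, enq_flags);
      }

      SEC(".struct_ops.link")
      struct sched_ext_ops fifo_ops = {
          .enqueue = (void *)fifo_enqueue,
          .name    = "fifo_sketch",
      };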

    I'm not saying Linux is the be-all and end-all of OS design, or that systems research is pointless. It's just a pretty good starting point for a lot of research, since it is free and quite capable on its own. As a researcher you get a lot of capability out of the box, plus a whole ecosystem of development tools ready to use.

    • musicale 2 days ago

      > Linux ate the world largely because anyone could do what they wanted with it

      Linux ate the server world because 1) it didn't have server licensing fees like Windows NT and proprietary Unix, 2) its closest competitor (BSD) was mired in lawsuits and uncertainty until 1994, 3) commodity x86 servers ended up competing very well on price/performance, and 4) there were possibly other factors like GPL vs. BSD, bazaar vs. cathedral, etc.

      On desktop and mobile, Linux did not exactly eat the world, though Android and ChromeOS do use the Linux kernel.

mike_hearn a day ago

There's plenty of R&D to be done on operating systems and systems software in general, but if you define an OS as merely a hardware abstraction then indeed there isn't, because the hardware is, almost by definition, hard for researchers to change. Nor is there much reason to do so.

The problems that are interesting to research in systems are all at a level much higher than the kernel: software distribution, distributed computing, user interfaces, and so on. There's plenty of scope to do interesting work there, but academia has mostly given up on it, I think because better designs at higher levels aren't considered "research" by granting agencies and journals.

Most of the interesting OS research is getting done by cloud vendors, Facebook, Apple and startups these days.

rjsw 2 days ago

He could maybe have added a few slides giving examples of the kind of discussions that OS people might have with HW designers.

Then examples of workarounds when the HW designers don't listen.

Thinking in terms of the OS running on the GPU as well as the CPUs would change how we connect them together. Recent ARM Mali GPUs could use the same virtual address for a memory page as the CPUs, but under Linux they don't.

johnea 2 days ago

OS's forgot about h/w?

  • schmidtleonard 2 days ago

    Yeah, it got buried and forgotten beneath three layers of pig lipstick and a Candy Crush ad.