[Rumor] Windows 12 to arrive in fall 2024 with a floating taskbar and a focus on AI

Look at how many recent Terminal setup tutorials for macOS/Linux/Windows you can find, and look at the effort Microsoft has put into its terminal in recent years:
https://github.com/microsoft/terminal
Look how active it is (8,200 forks, 1,500 issues, 500 branches, 99 releases in 5 years):
https://github.com/microsoft/terminal/issues

Command line is still extremely popular. (Look how popular vim still is)


You can go back to the previous right-click menu:
reg add "HKCU\Software\Classes\CLSID\{86ca1aa0-34aa-4e8b-a509-50c905bae2a2}\InprocServer32" /f /ve

Restart explorer.exe afterwards.
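If you want it to take effect without logging out, just restart Explorer, and you can undo the tweak by deleting the key again. A minimal sketch (same CLSID as above; a plain user prompt is enough since it's all under HKCU):

Code:
REM restart the shell so it picks up the change
taskkill /f /im explorer.exe
start explorer.exe

REM revert to the default Windows 11 context menu
reg delete "HKCU\Software\Classes\CLSID\{86ca1aa0-34aa-4e8b-a509-50c905bae2a2}" /f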
Talking about end users, not server admins. Terminal is very powerful, which is why Microsoft implemented WSL into Windows. You can get stuff done much quicker with terminal commands than with just the UI. Also a server admin wouldn't want a demanding UI wasting resources.
What??? Please turn in your gray beard immediately. Especially if you interact with servers.

I use the terminal constantly... it just comes down like a Quake console for me. Microsoft have finally gotten their act together and made a good terminal as well.
Why is it that everyone tries to find exceptions? We're talking about someone who isn't a server admin, which is like 99% of people. You can download Ubuntu and install it more easily than Windows, and it doesn't push you to log in to an online account. You don't even have to install drivers, with the exception of Nvidia. Back in 2013 you did need to type a lot of commands to get things like WiFi and sound working, but today it all just works. Steam just runs the games, assuming they're not online multiplayer titles that need anti-cheat.
 
Probably longer. The first Linux distro was in 1992!!


Amen brother. Years and years ago I worked for a startup that was doing a multi-user small business system running on (yeah!) System 3 with 4.1 BSD extensions. No kidding. And no desktop. So I actually got proficient in csh to the point of writing scripts that were edited in vi. At that time I had a CP/M system at home, so a CLI system at work was what it was. Then in the mid-90s, I worked for Sun Microsystems, with a GUI desktop on top of Solaris. I still had to use native UNIX commands sometimes.

Fast forward to today. Running desktop and laptop with Win 10 Pro 22H2. Wife's system is a similar laptop. All set up with LAN, LAN-based backup and synchronization, LAN printers, etc. Life is good. (No more BSODs.) Then I needed to run a program (TL;DR) that runs only on Linux, so I loaded up WSL. I wanted WSL so I wouldn't need to multi-boot. I HATE IT. Same as for Tengis. Just trying to access a Windows drive takes a bunch of steps, and is freakin finicky. :nailbiting: Or "fiddly," as our British friends would say. Yes, I still remember a lot of UNIX, and Linux commands are the same. But commands with umpty-wumpty options that take 3-4-5 tries to get right? Fuggedaboudid. Man pages? :mad: Poor substitute for a GUI-based system.

Last time I heard, Linux share of desktop users was like 2.8%. Linux itself is the reason why. :LOL:

Same for software. Can I get my daily driver software running on Linux? Most emphatically not. Not Outlook, not Photoshop, not Quicken, for starters. And most of the time that I use my PC, I'm "using" it, not "hobbying" it.


+1 to Tengis. I am 146% in agreement with him. I like the guy, even though we haven't met. :happy:
My experience with Linux is quite different from yours. Windows 8.0 was so bad that I started dual booting with Linux and eventually moved my Windows partition to a VM. The majority of the applications I used regularly under Windows are also available on Linux (LibreOffice, Thunderbird, VLC, Vivaldi, Firefox, jEdit, Lazarus, Steam, g++, make, etc). I usually start up the Windows VM after Linux is done booting so I'll have access to its drives (via SMB shares). I really only use Windows for games that won't work under Linux, Affinity Photo (I really dislike Gimp!) and TurboTax. My printers work under Windows and under Linux, so that's not a problem either.

Doh, forgot about TurboTax, I use the VM for that too. It's only for about a week when I do my taxes.
 
People have been saying this for 20+ years and Linux is borderline just as unusable as it's always been. Sure, basic desktop functions are fine, until you end up having to run 20+ things in the command line for something where the equivalent would take 30 seconds in Windows. I also have three pieces of equipment here that I use regularly that don't even work in Linux - my time is more valuable than "sticking it to the man" by using open source software.

You do you

I find that I am more efficient in Linux than in Windows. Updates are a snap and relatively instant. No waiting like in Windows. And I find things just work.

I try not to use little consumer trinkets without universal support on my PC anyway, so I don't really run into hardware compatibility problems. The only precaution I take is to make sure WiFi chips are compatible before I buy a device. But I barely use WiFi at all, so this is a once-in-a-decade concern, when I set up a new laptop. Everything else is wired.

I used to worry about software compatibility in Linux. Over a decade ago I needed iTunes to sync my iPod for my car and later my iPhones, but no one uses iPods anymore, and I switched away from Apple in 2012. (I understand the new iPhones no longer rely on iTunes either.)

I also used to worry about not being able to use Adobe software, but then Adobe went subscription, and that caused me to swear I would never use their shit again, and I haven't.

Same with MS Office lately. I used to worry about it working, and even used CrossOver Office to get it working. I swore I'd never pay for a 365 subscription. For a while I used the static 2016 version. I tried to upgrade to the static 2019 version, but even after buying the key I needed a Microsoft account to install it, and I was like "fuck that." The reason I buy static versions is that I refuse to have a Microsoft account.

So now I don't use Office either at home. LibreOffice will have to do.

So suddenly there really isn't any software I miss that I would have been using in Windows anyway. The only exception is games. I keep a stripped-down Windows install with as much as possible disabled, just for games. That's all it is good for anymore, in my book.

I find Windows causes me more trouble than Linux does. It isn't 2003 anymore.
 
What you're describing is GNU/Linux from 2013, not 2023. Nobody types commands in Linux unless they're into stupid things like tweaking.

I don't find that to be an accurate statement at all.

I daily everything from the command line.

Want to run an update?

sudo apt update
sudo apt dist-upgrade


Want to install a package?

apt-cache search <string>
sudo apt install <package name>


Want to find installed packages?

apt list --installed |grep -i <string>

Want to remove a package?

sudo apt remove <package>

Want to automatically clean up dependencies?

sudo apt autoremove

Want to configure a package?

sudo nano /etc/<text config file> and edit commented text file

Starting, and stopping services, mass renaming files, finding that lost file, copying/moving file repositories, using dd to image or restore a drive, etc. etc.
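A few illustrative one-liners for those (the paths, device names, and service here are made up for the example; adapt before running, especially the dd one):

Code:
# restart a service on a systemd-based distro
sudo systemctl restart ssh

# mass rename: add a .bak suffix to every .conf file in the current directory
for f in *.conf; do mv "$f" "$f.bak"; done

# find that lost file anywhere under /home, case-insensitive
find /home -iname '*invoice*2023*'

# copy/move a whole file repository, preserving permissions and timestamps
rsync -a --progress /data/photos/ /mnt/backup/photos/

# image a drive to a file (triple-check the device name first)
sudo dd if=/dev/sdb of=/mnt/backup/sdb.img bs=4M status=progress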

Everything is so much faster and more efficient from the command line. The GUI is slow and frustrating by comparison, as you have to sift through things and visually look for settings without the advantage of a search feature.

The overwhelming majority of GUI solutions are a significant downgrade from the console.

That's one of my biggest frustrations with computers these days. That people have convinced themselves that "text is hard". It's really dumb. Yes, text based computer management has a small learning curve, but once you are past it it is SO much more efficient than the alternative.

There is nothing I hate more than having to configure something in Windows and being faced with tab after tab of check boxes and radio buttons, struggling to find that one setting I am looking for. Win10+ has become a little more manageable as the settings are at least partially searchable, but still, nothing beats a good text file config. Nothing.

This is why I wish netplan would go to hell. Bring back ifup/ifdown. It was SO much easier and more convenient to use.

This is how I use Linux. This is how everyone I know uses Linux.

What Linux users even use the Linux GUI? Your grandmother / aging parent who you dumped Ubuntu or Mint on a laptop for so she could use her aging computer and not mess it up between your visits?

Like, semi serious question. I have literally never even heard of Linux users using Linux the way you describe.


Next, I would like to complain about Snaps, Flatpak and AppImage, all of which can fuck right off. Any developer who distributes their software in them needs to be stabbed. System package manager or bust.
 
Talking about end users, not server admins. Terminal is very powerful, which is why Microsoft implemented WSL into Windows. You can get stuff done much quicker with terminal commands than with just the UI. Also a server admin wouldn't want a demanding UI wasting resources.
Talking about a very common kind of end user of an OS, in a very common job (maybe a top-25 one in the Western world, depending on how you cut it): software and web developers. The tutorials in question will rarely be for server admins. It is not really niche or the exception, especially among desktop Linux users.
 
The thing that bothers me about Linux is no HDR or color depth settings. I use Windows for entertainment and Linux for everything else.

Yep.

It's not just a lack of settings. HDR functionality is completely absent in Xorg and Wayland.

That's one big gaping hole in the capability of Linux systems for entertainment purposes right now. Some special-purpose systems running Linux can pass HDR video to a screen as part of hardware video decoding, but that's about it. We are talking about something like CoreElec running on an Odroid.

It seems the ball was really dropped on this. I can understand why Xorg doesn't support HDR. It is ancient, and based on the even more ancient X11. But Wayland is supposed to be the up-and-coming Xorg replacement, and it lacks HDR support as well. There are some other desktop environments being worked on that say they will support HDR, like COSMIC, but who knows what kind of traction they will gain, and when they will finally reach a mainstream state.

To be fair though, I neither watch movies on my Linux desktop nor play games under Linux, so I don't miss HDR there, but I know some people would.
 
What??? Please turn in your gray beard immediately. Especially if you interact with servers.

I use the terminal constantly... it just comes down like a Quake console for me. Microsoft have finally gotten their act together and made a good terminal as well.
I mean I just spent a day in CMD and PowerShell trying to clean up and force-remove some dead AD servers from Active Directory, and cleaned up Exchange while I was in there. Then I configured a half dozen new switches over a serial connection in PuTTY.
So the command line is alive and well for Windows too.
 
I mean I just spent a day in CMD and PowerShell trying to clean up and force-remove some dead AD servers from Active Directory, and cleaned up Exchange while I was in there. Then I configured a half dozen new switches over a serial connection in PuTTY.
So the command line is alive and well for Windows too.

Indeed. Though I find Powershell much less user friendly than Linux. That might just be based on what I am used to though.
 
Indeed. Though I find Powershell much less user friendly than Linux. That might just be based on what I am used to though.
I wouldn't say it's less user friendly, because once you know the basics it is very simple overall. But it has so many modules and functions.
Fantastic for scripting.
It becomes one of those things that you create a library of scripts for that you just call on for the annoying things.
Need to create 300 new user accounts? Just feed it the CSV and have it sync with O365. Boom, it creates them in Exchange, sets their distribution groups, syncs with AD, sets their permission groups, then starts a sync with O365 when complete.
Need to modify everybody's licenses based on grouping? Script for that too.
Want to run system health checks? It can use AD enrollment to crawl the network, pull the info, then put it together in a color-coded HTML file for easy digestion.

PowerShell is annoying AF for single-line commands, but fantastic for scripts that do relatively complex repetitive tasks.
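Something in the spirit of that bulk-creation workflow, as a rough sketch only (the CSV columns, group names, and the final sync step are assumptions, not anyone's actual script; needs the ActiveDirectory module, and the ADSync module on the Azure AD Connect server):

Code:
# Sketch: bulk-create users from a CSV with First,Last,Username,TempPassword,Group columns
Import-Csv .\newhires.csv | ForEach-Object {
    New-ADUser -Name "$($_.First) $($_.Last)" `
        -SamAccountName $_.Username `
        -GivenName $_.First -Surname $_.Last `
        -AccountPassword (ConvertTo-SecureString $_.TempPassword -AsPlainText -Force) `
        -Enabled $true
    Add-ADGroupMember -Identity $_.Group -Members $_.Username
}

# then kick off a delta sync to O365
Start-ADSyncSyncCycle -PolicyType Delta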
 
I've spent a little time recently in PowerShell working on YubiKey integration with WSL. It's not amazing for what I've needed to do, but not bad. It would have been very useful to me in the early 2000s when I was doing a lot more with Windows.
 
Indeed. Though I find Powershell much less user friendly than Linux. That might just be based on what I am used to though.
PowerShell rocks, it's generally very consistent because unlike Bash, everything's an actual object - not just text that you need some arbitrary tool to try to work with. People whine about how verbose it is (and it is) but you can tab complete _everything_.

Plus, if you can do it in .NET, you can do it in PowerShell. You can trivially call any part of the entire .NET API, and even PInvoke native code if you really want. Need to call parts of the Win32 API or C library for reasons beyond earthly logic? Sure - why not.

Not sure what an object has on it? Just Select-Object * it.
Code:
Get-Process | Select-Object * -Last 1

Want to filter what you get back? Where-Object and write a condition. Want to iterate over what you get back? ForEach-Object.
Want to delete every user in your AD domain and bomb your boss with an email for every account you kill?
Code:
# splat the same mail settings into every Send-MailMessage call
$emailArgs = @{
    From    = "[email protected]"
    To      = "[email protected]"
    Subject = "cya dweeb"
}

# Get-ADUser needs a filter; ForEach-Object runs the block once per user
Get-ADUser -Filter * | ForEach-Object {
    Send-MailMessage @emailArgs
    $_ | Remove-ADUser
}
 
Want to run an update?

sudo apt update
sudo apt dist-upgrade
Or you can use update manager.
Want to install a package?

apt-cache search <string>
sudo apt install <package name>
Software Manager
Want to find installed packages?

apt list --installed |grep -i <string>
Synaptic Package Manager
Want to remove a package?

sudo apt remove <package>
Also Synaptic Package Manager.
Want to automatically clean up dependencies?

sudo apt autoremove
Stacer
Want to configure a package?

sudo nano /etc/<text config file> and edit commented text file
Gedit
Starting, and stopping services, mass renaming files, finding that lost file, copying/moving file repositories, using dd to image or restore a drive, etc. etc.
But my point is you don't need to use the terminal and enter commands. There are already alternatives.
Everything is so much faster and more efficient from the command line. The GUI is slow and frustrating by comparison, as you have to sift through things and visually look for settings without the advantage of a search feature.
I agree and use commands all the time. Software Manager takes a while to load up, while entering a command to install software is much quicker.
 
I agree and use commands all the time. Software Manager takes a while to load up, while entering a command to install software is much quicker.

You know, I've been using Ubuntu/Mint since like 2006-2007 some time, when I switched from Gentoo to Ubuntu. (Later, in ~2011, when Ubuntu moved to Unity as the default desktop, I switched to Mint, where I have stayed since.)

In all that time I've never once installed any software from the Software Manager. I consider it to be a "n00b tool" and mostly wasteful bloat, just like most of the other GUI configuration tools they have slowly built in over the years.

When I need a new package, I go straight to the command line every time.

# sudo apt update
# apt-cache search <package search term>
# sudo apt install <package name>

(I've only recently trained my fingers to stop using apt-get, and replace it with just apt. Muscle memory is hard to change, man)

Dead simple. Much faster and more convenient than clicking through a stupid GUI "storefront."

I barely ever even open the software updater either. When it's time for software updates, it is:

# sudo apt update
# sudo apt dist-upgrade (or just upgrade, but I usually use dist-upgrade)

I used to do almost all of my configuration from the command line. With ifup/ifdown it was dead simple to configure my network from the command line.

Just edit /etc/network/interfaces

Now they've made a mess of it with netplan, using some dumbass "yaml" notation that makes absolutely no sense.

Why on earth use either YAML or XML when plain text just works? So they have forced me to use the GUI tools, which I kind of chafe at.
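For anyone who hasn't run into it, here is roughly what the same static address looks like both ways (the interface name and addresses are placeholders):

Code:
# the old way: /etc/network/interfaces (ifupdown)
auto eth0
iface eth0 inet static
    address 192.168.1.50
    netmask 255.255.255.0
    gateway 192.168.1.1

# the netplan way: /etc/netplan/01-netcfg.yaml, applied with 'sudo netplan apply'
network:
  version: 2
  renderer: networkd
  ethernets:
    eth0:
      addresses: [192.168.1.50/24]
      routes:
        - to: default
          via: 192.168.1.1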

I'd argue Ubuntu and its derivatives peaked with the 14.04 LTS release. They used Upstart instead of the terrible SystemD, still used ifup/ifdown, and they hadn't started pushing the bullshit Snaps (or, in Mint, Flatpak) yet.

(Snaps, Flatpak and AppImage can die in a fire. I don't care if they make software developers' lives easier. They are dumb, introduce bloat, and break the dependency tree and security.

Any given system should have 100% of its software in the one central package manager, or it shouldn't be included or used at all. I will die on this hill.)
 
Why on earth use either YAML or XML when plain text just works?
YAML/XML are not binary; they are pure text and editable in vim.

This is one thing generative AI is quite nice for: a very simple and standard small affair, but in an unknown new notation/syntax.

Any given system should have 100% of its software in the one central package manager, or it shouldn't be included or used at all. I will die on this hill.)
As someone who makes very niche software solutions, sometimes custom to the client, I would be a bit worried about this; it would need to be quite a flexible affair. What about the program I am currently making myself? Should I need to upload it to a server, then download it via a central package manager, just to run a test? That sounds a bit too much like some Apple system to me.
 
YAML/XML are not binary; they are pure text and editable in vim.

This is one thing generative AI is quite nice for: a very simple and standard small affair, but in an unknown new notation/syntax.

As someone who makes very niche software solutions, sometimes custom to the client, I would be a bit worried about this; it would need to be quite a flexible affair. What about the program I am currently making myself? Should I need to upload it to a server, then download it via a central package manager, just to run a test? That sounds a bit too much like some Apple system to me.

Of course a developer should be able to compile and install their own binaries for testing (or for personal use) using "make && make install", or better yet by assembling a .deb file (if on a Debian-based distro) and manually installing it using Synaptic or Apt.
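For the Debian case, the quick-and-dirty version is something like this (the package name and layout are placeholders; you still need to fill in DEBIAN/control with at least Package, Version, Architecture, Maintainer and Description):

Code:
# stage the files the package should install
mkdir -p mytool_1.0/DEBIAN mytool_1.0/usr/local/bin
cp mytool mytool_1.0/usr/local/bin/
# write mytool_1.0/DEBIAN/control, then build and install through the package manager
dpkg-deb --build mytool_1.0
sudo apt install ./mytool_1.0.deb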

I'm talking about the design for general use. Most packages used by most people should be in the repos for the distribution. Not everything will be, and that's why we have community sources and PPAs (though people need to do a MUCH better job of vetting PPAs and not just blindly installing them, as a malicious PPA can pretty much install anything on your system).

One of the major benefits of most Linux distributions when it comes to security is that central package manager, with your entire dependency tree maintained and patched for security by the distribution maintainer.

When you start bundling binaries with their dependencies in a binary blob (or even worse, statically compiling them), not only do you wind up with a ton of duplicated dependencies, which is highly inefficient and results in bloat; you also have to trust every little package maintainer to always stay up to date and include the latest patched dependencies, which is a mess.

If there is a new zero day tomorrow affecting one of the dependencies on my system, I can check. Hey, what version of package X resolved this CVE? Do I have that version? Good!
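On a Debian/Ubuntu box that check is a couple of commands (openssl here is just an example package):

Code:
# what's installed vs. what the repo currently ships
apt-cache policy openssl
# did the distro changelog mention a fix for the CVE in question?
apt-get changelog openssl | grep -i cve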

In a Snap/FlatPak/AppImage world it's just a shitshow of mixed dependency versions all over the place to keep track of. You never know (without a herculean effort) which versions of which dependencies are in each blob, and whether they are a high enough revision to have the issue fixed, or whether they are an earlier version with a fix backported. It's an unholy shitshow.

In either configuration you may have some corner-case little package that rarely gets updated, but in the "everything in the package manager" scenario, its dependencies will still be patched and updated. In the "binary blob with included dependencies" scenario, a dependency with an unpatched zero day can linger unbeknownst to the user for years.

If I wanted Linux to be more like Windows, I would be using Windows.

I'd love to rewind the clock to Ubuntu 14.04 LTS (on the server; the Ubuntu GUI already sucked at that time) or Mint 17.3. That was the last time Linux was sane, before the "let's turn Linux into a shitty proprietary Windows knockoff" people started getting too much control.

I would like nothing more than to kill SystemD, revert back to Upstart, get my ifup/ifdown back, and live in a world without Snaps/Flatpak (or AppImage).

That's not to say there haven't been good developments in Linux as well. Wayland is very promising (but moving waaay(land) too slowly, IMHO). It is time for X11/Xorg to die once and for all.

I'm all for good changes, and opposed to dumb ones.
 
If I wanted Linux to be more like Windows, I would be using Windows.

I would like nothing more than to kill SystemD, revert back to Upstart, get my ifup/ifdown back, and live in a world without Snaps/Flatpak (or AppImage).

I'm on the Snap/Flatpak are shit train, but what's wrong with systemd?
 
I'm on the Snap/Flatpak are shit train, but what's wrong with systemd?

Many people with far more expertise on the subject than me have addressed it ad nauseam over the years, so I am just going to cite the defunct boycottsystemd.org site:

  • systemd flies in the face of the Unix philosophy: "do one thing and do it well," representing a complex collection of dozens of binaries. Its responsibilities grossly exceed that of an init system, as it goes on to handle power management, device management, mount points, cron, disk encryption, socket API/inetd, syslog and other things.
  • systemd's journal files (handled by journald) are stored in a complicated binary format, and must be queried using journalctl. This makes journal logs potentially corruptable. Oh, an embedded HTTP server is loaded to read them. QR codes are served, as well.
  • systemd's team is noticeably anti-Unix, due to their open disregard for non-Linux software and subsequent systemd incompatibility with all non-Linux systems. Since systemd is very tightly welded with the Linux kernel API, this also makes different systemd versions incompatible with different kernel versions. This is an isolationist policy that essentially binds the Linux ecosystem into its own cage, and serves as an obstacle to software portability.
  • udev and dbus are forced dependencies. In fact, udev merged with systemd a long time ago.
  • By default, systemd saves core dumps to the journal, instead of the file system. Core dumps must be explicitly queried using systemd-coredumpctl. Besides going against all reason, it also creates complications in multi-user environments (good luck running gdb on your program's core dump if it's dumped to the journal and you don't have root access), since systemd requires root to control. It assumes that users and admins are dumb.
  • systemd's size makes it a single point of failure. As of this writing, systemd has had 9 CVE reports, since its inception in March 2010. So far, this may not seem like that much, but its essential and overbearing nature will make it a juicy target for attacks, as it is far smaller in breadth than the Linux kernel itself, yet seemingly just as critical.
  • systemd is viral by its very nature. Its scope in functionality and creeping in as a dependency to lots of packages means that distro maintainers will have to necessitate a conversion, or suffer a drift. As an example, the GNOME environment has adopted systemd as a hard dependency since 3.8 for various utilities, including gdm, gnome-shell and gnome-extra-apps. This means GNOME versions >=3.8 are incompatible with non-Linux systems, and due to GNOME's popularity, it will help tilt a lot of maintainers to add systemd. The rapid rise in adoption by distros such as Debian, Arch Linux, Ubuntu, Fedora, openSUSE and others shows that many are jumping onto the bandwagon, with or without justification. It's also worth noting that systemd will refuse to start as a user instance, unless the system boots with it as well - blatant coercion.
  • systemd clusters itself into PID 1. Due to it controlling lots of different components, this means that there are tons of scenarios in which it can crash and bring down the whole system. But in addition, this means that plenty of non-kernel system upgrades will now require a reboot. Enjoy your new Windows Linux system! In fairness, systemd does provide a mechanism to reserialize and reexecute systemctl in real time. If this fails, of course, the system goes down. There are several ways that this can occur. This happens to be another example of SPOF.
  • systemd is designed with glibc in mind, and doesn't take kindly to supporting other libcs at all.
  • systemd's complicated nature makes it harder to extend and step outside its boundaries. While you can more or less trivially start shell scripts from unit files, it's more difficult to write behavior that goes outside the box, what with all the feature bloat. Many users will likely need to write more complicated programs that directly interact with the systemd API, or even patch systemd directly.
  • Ultimately, systemd's parasitism is symbolic of something more than systemd itself. It shows a radical shift in thinking by the Linux community. Not necessarily a positive one, either. One that is vehemently postmodern, monolithic, heavily desktop-oriented, choice-limiting, isolationist, reinvents the flat tire, and is just anti-Unix in general. If your goal is to pander to the lowest common denominator, so be it. We will look for alternatives, however.

Essentially, it's a bloated single point of failure that tries to do everything, and through its architecture paints distributions and open source developers into a corner. Open source software developers pretty much need to make it a dependency, and when they do, those distributions that choose not to use systemD for whatever reason are locked out of that software. It also breaks long-standing norms of compatibility and interoperability between Linux and Unix variants.

You used to be able to run Linux on pretty much anything, no matter how old. System resources (drive space used, RAM needed, etc.) used to be super light for Linux, unless you intentionally chose packages that were heavy. I remember installing Linux with a light window manager (like Xfce or LMDE) on ancient machines that could barely run modern Windows releases, and they were snappy as all hell. With SystemD the bloat is real. It has gone a long way to harm the lightness and efficiency of Linux.

It represents turning Linux into something more like Windows, in a world where most people who use Linux, and who are passionate about Linux, do so in large part because they don't like Windows.

It's mostly a moot point now though. This was an important argument back in ~2014. At this point the damage is (probably irreversibly) done. Most of the things they warned about have become reality.
 
You don't even need an ancient system to see the impact of systemd's footprint. Grab a 1GB Pi 4 and install Raspbian Lite. A noticeable fraction of the RAM is tied up right on boot, and it's exceptionally hard to lighten it up.
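If you want to put numbers on that, systemd ships its own introspection tools; run on the box itself, something like this gives a rough picture (nothing here is specific to the Pi):

Code:
# per-unit memory/CPU use, straight from the cgroups
systemd-cgtop
# what took how long at boot
systemd-analyze blame
# overall RAM picture
free -h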
 
Many people with far more expertise on the subject than me have addressed it ad nauseam over the years, so I am just going to cite the defunct boycottsystemd.org site:

  • systemd flies in the face of the Unix philosophy: "do one thing and do it well," representing a complex collection of dozens of binaries. Its responsibilities grossly exceed that of an init system, as it goes on to handle power management, device management, mount points, cron, disk encryption, socket API/inetd, syslog and other things.


I've seen this argument, and I don't get it.

systemd is a collection of tools under a single umbrella; it's not a single binary monolith. This is like saying coreutils flies in the face of the Unix philosophy. There are literally hundreds of packages under the GNU umbrella.
 
Many people with far more expertise on the subject than me have addressed it ad nauseam over the years, so I am just going to cite the defunct boycottsystemd.org site:

Essentially, it's a bloated single point of failure that tries to do everything, and through its architecture paints distributions and open source developers into a corner. Open source software developers pretty much need to make it a dependency, and when they do, those distributions that choose not to use systemD for whatever reason are locked out of that software. It also breaks long-standing norms of compatibility and interoperability between Linux and Unix variants.
I avoided systemd from the start of my use of Linux. It just seems so against the standard Unix philosophy. I used Devuan Linux for a few years, then switched to MX Linux. I tried systemd with MX Linux (accidentally enabled it!) but it had an issue with my Windows VM that I didn't like. It wouldn't bring down the VM's GPU in a timely manner when I stopped the VM. It would take minutes for the GPU to shut down, while the non-systemd system would shut it down almost immediately.
 
12Lite anyone? :-P

As for the CLI, it seems I'm using Bash more in macOS than I do in Windows on the desktop!
 
All Windows threads devolve into Linux circlejerking. It's as inevitable as death.
Right. Been using Linux every day since 1994. (Still feel like it has to be said in these threads.) Still don't want it as a primary desktop at the house. I'll happily move to Win12 without bitching and moaning for 2-4 years about how bad it sucks.
Maybe I'm just getting old. I work and experiment in *nix all day for my job. Don't really care to do it anymore at the house outside of my server/Plex/container stuff, which is pretty steady-state at this point thanks to automation and config management.
 
Alright, let's bring it back on topic then.

Lots of Windows users on the Hardforums.

Does anyone actually want a floating taskbar, a focus on AI, and potentially even Copilot replacing the Start menu?

Like, even one person?

I guess my take is that the system is broken. In a free market, the customer is supposed to run the show, and the businesses are supposed to fight amongst themselves over who can best satisfy what the customer wants.

In the tech world, however, it has seemingly become a battle over who can shove the most unwanted junk down consumers' throats.

Some might argue that they need to educate the users about what they actually want, and that once they have gotten used to it they will find it indispensable, because the tech giants know better, but I highly doubt that.

The truth is they don't give a rat's ass what users want. They have an agenda, and they are going to push it come hell or high water.

The theory goes that this is where a new competitor steps in and gives the consumer what they want, and the big lumbering giant that has ignored the consumer's wishes fails and eventually goes bankrupt. But I think we are in a period where the barriers to entry are too high, and thus the rules of free markets no longer apply.

It's probably time to break them all up. Microsoft, Google, Facebook, Apple, Amazon, Nvidia, heck, even Intel and AMD. Every last one of them. None should be left standing. Go trustbuster on their asses! It's time to completely upend the entire tech sector, until things get back to the way they are supposed to be, where we sing and they dance, and they worship our every step to get our hard-earned cash.
 
Alright, let's bring it back on topic then.

Lots of Windows users on the Hardforums.

Does anyone actually want a floating taskbar, a focus on AI, and potentially even Copilot replacing the Start menu?

Like, even one person?
I'm sure they're out there. Kinda knew a long time ago that MS was never going to release a Windows version I liked, nor make changes I'd be interested in.
 
Does anyone actually want a floating taskbar, a focus on AI, and potentially even Copilot replacing the Start menu?

Like, even one person?
I don't care about the taskbar, I auto-hide mine, but I certainly do not want the AI bullshit.
Someone at MS does.
 
Does anyone actually want a floating taskbar, a focus on AI, and potentially even Copilot replacing the Start menu?

Like, even one person?
Well, almost everyone would prefer to just tell their computer in natural language what to do, like every sci-fi movie predicts will happen. Never bet against the most natural way to do things: the text command line crushed switches and lights, the mouse and GUI beat the command line in popularity, the phone/tablet finger beat the mouse in popularity, and so on.

Telling your computer out loud "every time I receive an email from my colleague John do X, Y, Z; from client Factory of Tire do A, B, C," instead of learning a scripting language or googling to find ways to do it via menus: yes, if it works well it will be popular. Saying out loud, or typing, "when my computer boots I never want Norton antivirus to launch": many will love that.
 
I'm down for a floating taskbar as long as it works the way it should and isn't 1/2 broken.
 
Alright, let's bring it back on topic then.

Lots of Windows users on the Hardforums.

Does anyone actually want a floating taskbar, a focus on AI, and potentially even Copilot replacing the Start menu?

Like, even one person?

All I want is for Windows to have a universally consistent UI. I love how they _still_ have old ass dialogs all over the place.

Modern Microsoft absolutely cannot be trusted near UIs.

Also yes, I think there is a big market for someone to be able to literally say what's wrong and potentially get help. Plus GPT-4 can parse images.

There's a non-zero chance this could literally look at what you're doing once you summon its aid and help you.

I don't want it or need it, but I can see the vision
 
Well, almost everyone would prefer to just tell their computer in natural language what to do, like every sci-fi movie predicts will happen. Never bet against the most natural way to do things: the text command line crushed switches and lights, the mouse and GUI beat the command line in popularity, the phone/tablet finger beat the mouse in popularity, and so on.

Telling your computer out loud "every time I receive an email from my colleague John do X, Y, Z; from client Factory of Tire do A, B, C," instead of learning a scripting language or googling to find ways to do it via menus: yes, if it works well it will be popular. Saying out loud, or typing, "when my computer boots I never want Norton antivirus to launch": many will love that.
Maybe. I feel like talking is just inherently slower; there's also processing time. If I can have it build GUI scripts for me, or shell scripts that I want to run, then sure, that's cool, but there are still times when I just want to click on things rather than say "Ciri, open Firefox and go to youtube.com." Also, you're making an assumption that it won't get gated and will actually do anything I want it to do (like the Norton thing), which I doubt it will.
 
Maybe. I feel like talking is just inherently slower; there's also processing time. If I can have it build GUI scripts for me, or shell scripts that I want to run, then sure, that's cool, but there are still times when I just want to click on things rather than say "Ciri, open Firefox and go to youtube.com." Also, you're making an assumption that it won't get gated and will actually do anything I want it to do (like the Norton thing), which I doubt it will.

That's you, though.

What is the starting point for some old ass grandma who looks like she crawled out of Tutankhamen's tomb? Even if you somehow got the start menu open and started typing "connect to the Internet," at that point you've confused search and you're in the no man's land that is Bing. Eventually we'll be at the point where we could have a fallback that works locally in these cases.

Then it gets you over that hump and online. Literally asking "how do I message my grandson." Chances are you'll get a nice list - email, messaging apps like WhatsApp, Facebook, whatever. Email sounds right.

"How do I send an email?" Ideally you'll get walked through setting up an account. Maybe it'll ask your username, it can infer the provider and drop you at the point where you're just putting in your password.

AI is great at actually parsing something meaningful and walking its way to a solution. You can truly know absolutely nothing and you can still get somewhere with relative ease. It won't get frustrated or roll its eyes. It will happily try to explain anything and everything you ask. And you can legitimately talk to it in a natural way.

And they're usually pretty good at figuring out whatever cryptic horse shit you asked it. Your terminology or spelling can be a bit off and there's a good chance it'll manage.

Once it has some context, you can even be vague as hell and it'll still figure things out. "Screenshot" for example will probably immediately tell you what to do. The first hit on Google is a Wikipedia article. This sounds like a minor thing, but will probably make a huge difference.

The typical person is fucking terrible at searching for information.
 
it won't get gated and will actually do anything I want it to do (like the Norton thing), which I doubt it will.
How much they will allow remains to be seen, and there could be a battle between what Microsoft wants and the HPs and Dells of the world. But something that does that is just a big positive for Microsoft in a vacuum here: they gain nothing (directly) from partners making money with bloatware, and I do not remember them ever blocking scripts that uninstall it.

I feel like talking is just inherently slower; there's also processing time. If I can have it build GUI scripts for me, or shell scripts that I want to run, then sure, that's cool, but there are still times when I just want to click on things.
Starting to type the first letters of what you want and pressing Enter will still be the way to go for the simplest things, like launching an application (or having the most common ones as icons in a launch zone). AI will be for more complicated affairs; it would maybe have a use for the rare times I go into and browse through the Start menu because I do not know the name of what I want to do.
 
Starting to type the first letters of what you want and pressing Enter will still be the way to go for the simplest things, like launching an application (or having the most common ones as icons in a launch zone). AI will be for more complicated affairs; it would maybe have a use for the rare times I go into and browse through the Start menu because I do not know the name of what I want to do.

This is exactly where it's going to be good.

Sometimes people don't even know what they're trying to do. They know the vague goal, but can't articulate it/break it down into anything that would help them get there.

"Message grandson" is a legitimate thing to ask AI in this context and absolutely useless to throw at a search engine.
 