Windows XP support ended - day 1
Comments
-
"This is my point. if you're going to develop something make it better, don't make it fat like half of the UK population, make it lean, more efficient, an actual bang for your buck that justifies the start from scratch money layout these new OS's require everything from the corporation to the private user to find"
That argument is so tired it's out shopping for a double bed.
The first Unix I used was Sixth Edition, running on a PDP-11/34. Sixth Edition didn't need split I&D, so it ran with the kernel fitting into 64KB covering both the text (operating system instructions) and data. Each individual program that you ran had to fit into another 64KB lump of RAM. The 11/34 wouldn't run Seventh Edition, because that required split I&D, which meant buying a new computer (an 11/70, yours for a mere thirty grand).
Today, even a relatively stripped Linux kernel is a 3MB text segment, with about the same requirement for data, and you would be ambitious to run Linux in less than about 16MB, even doing a job you could manage on Sixth Edition.
However, 16GB of RAM is, what, about £100? So if you could, by some miracle, cut the Linux kernel down so that it ran in 0.06MB rather than 16MB, it would make the computer about 10p cheaper. Don't spend it all at once. In reality, of course, it would do no such thing, because memory isn't packaged like that: it would just free up an additional 0.1% of RAM for use by userspace processes. Again, don't spend it all at once.

But that's not bloat, as some greybeards would have you believe (my beard is, when I grow it, grey enough to make me start thinking like that). It's networking (IPv4 and IPv6), it's sound, it's decent observability and debugging capability, it's volume management, it's decent schedulers, it's journalling and snap-shotting filesystems, it's video, it's a whole stack of things that are worth a great deal more than 10p.
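For anyone who wants the arithmetic behind that 10p spelled out:

    £100 / 16,384MB  ≈ 0.61p per MB of RAM
    16MB × 0.61p/MB  ≈ 10p
    16MB / 16,384MB  ≈ 0.1% of the machine's RAM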
Too many people pick a golden age and attempt to argue that everything since then is needless bloat. I'm prone to it myself, and attempting to wean undergraduates off Eclipse and telling them you can do fast and effective software development with ed on a Raspberry Pi doesn't go down too well. But plenty of people argued that each of the features I named above was unnecessary, and they were largely, hindsight tells us, wrong. There's stuff in modern kernels that is more than worth its corn, and more than worth 10p of RAM to run it in. It's ludicrous to suggest that we should worry about using 10p of resources.
-
If you're adding two numbers up in a spreadsheet - something that takes very few bytes of machine code, as it always did - then the vast amount of corn-thrashing going on under the hood isn't doing anything important: it's bloat.
Sound, networking, video, mouse pointers, windows etc. have been around a long time. Processors, disks and memory have improved vastly over the last few decades, but the experience of speed in normal day-to-day tasks and boot times hasn't really, unless you happen to use an SSD, because the OS now does a whole host of things it doesn't need to.
The more bloat, the more security holes, the more patches, the more bloat, GOTO start!!
-
"If you're adding two numbers up in a spreadsheet - something that takes very few bytes of machine code, as it always did - then the vast amount of corn-thrashing going on under the hood isn't doing anything important: it's bloat."
Is your argument about size or the number of instructions being executed? Operating systems have a lot more functionality, but if it's not being used, then all it's doing is sitting there. The microkernel wars are essentially over and we're left with monolithic large kernels, so if it's kernel side, it's sat there in RAM, but RAM is cheap (and the volumes are tiny: if you compile Linux without module support and include every single facility, I think it ends up at around 6MB). Using an extra 6MB of RAM does not slow a computer down. If the facility isn't in the kernel, it's sat there on disk unused, and that _really_ doesn't slow a computer down.
"Sound, networking, video, mouse pointers, windows etc. have been around a long time."

That's hardly the point. I've used machines with none of them, in an era when only one of them (networking) was available at all outside research labs. Again, people pick their golden age and claim that all change after that point is decay. Volume management only arrived in consumer operating systems within the last couple of years (Core Storage in OS X, the various Logical Volume Manager derivatives in Windows). They are major, invasive changes to the kernel, with large user-side changes as well, and are utterly essential as volumes of data continue to grow.
"Processors, disks and memory have improved vastly over the last few decades, but the experience of speed in normal day-to-day tasks and boot times"

But people don't (or shouldn't) reboot machines, and sleep/hibernate is all new. I don't reboot my laptop from one month's end to the next (normally only for a kernel update). And the elephant in the room is disk read performance, which scales roughly as the square root of capacity multiplied by the (small) increase in rotational speed we've seen over the past thirty years, but with the limiting factor of seek performance, which has improved only from ~25ms in the 1980s to ~6ms today - barely worth having. Which is why all those fancy filesystems you're so quick to dismiss matter so much.
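Rough numbers make the point (the seek times are the ones above; the sequential rate is a typical modern ballpark, not a measured figure):

    1980s: ~25ms per seek  ->  ~40 random reads/s
    today: ~6ms per seek   ->  ~165 random reads/s (a ~4x gain)
    ~165 random 4KB reads/s ≈ 0.65MB/s, against ~150MB/s sequential

Random access has fallen a couple of hundred times behind sequential while capacities have grown by factors of tens of thousands, and that gap is precisely what journalling and snap-shotting filesystems are engineered around.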
"hasn't really, unless you happen to use an SSD, because the OS now does a whole host of things it doesn't need to."

So run a stripped-out Linux on a Raspberry Pi. I do software development on a couple; they're fine. Your choice.
"The more bloat, the more security holes, the more patches, the more bloat, GOTO start"

I assume you're doing development work on modern operating systems in order to tell us all where we're going wrong?
-
Yes, that's exactly what I'm doing - and it's clear you're knowledgeable yourself, so tell me: is a piece of bad code noticed less inside an enormous room, or is it noticed more inside a compact, sleek routine?

securityguy wrote: »
Too many people pick a golden age and attempt to argue that everything since then is needless bloat. I'm prone to it myself, and attempting to wean undergraduates off Eclipse and telling them you can do fast and effective software development with ed on a Raspberry Pi doesn't go down too well. But plenty of people argued that each of the features I named above was unnecessary, and they were largely, hindsight tells us, wrong. There's stuff in modern kernels that is more than worth its corn, and more than worth 10p of RAM to run it in. It's ludicrous to suggest that we should worry about using 10p of resources.
I do take on board your comments, but I am a golden age kind of guy. I harken back to what Braben did with Elite, fitting that much code into that little memory - the space-saving ideas he came up with were as ingenious as they were elegant. You wouldn't get that kind of technique in modern counterparts, quite simply because if you paint a room the wrong colour inside your routine, you just move on to the next room and leave it; you don't necessarily have to work out 1. why and 2. how to overcome it.
I've got 7 on here myself, because I'm a lazy sod and because I don't want strife with my internet banking - but that's their side forcing my hand, not mine. I have a virtual XP and various forms of Linux, but when I write routines in Windows I see, day in, day out, the possibility to just throw your hands in the air, grab another bit of memory and start again. I'd hazard a good guess I was a better programmer in the days of the BBC than I am now.
Why is all this relevant? Well - because the answer to the problem wasn't more RAM; the question surely is: how can we effectively use the vast sizes of RAM we have at our disposal? But as far as the OSes go, they very much take what they can get rather than innovate; it's far more the drive-through McDonald's than the thoughtful combining of ingredients.
-
I would sit on the fence and wait to see what the Chinese come up with if their replacement team is found wanting...

lazer wrote: »
Do you really think it's economically viable for them to keep XP up to date? AND Vista? AND 7? AND 8?
And again, it consistently comes down to money - if a superpower nation decides it wants XP support, then it becomes economically viable again.
-
andydiysaver wrote: »
I do take on board your comments, but I am a golden age kind of guy. I harken back to what Braben did with Elite, fitting that much code into that little memory - the space-saving ideas he came up with were as ingenious as they were elegant. You wouldn't get that kind of technique in modern counterparts, quite simply because if you paint a room the wrong colour inside your routine, you just move on to the next room and leave it; you don't necessarily have to work out 1. why and 2. how to overcome it.
Braben (and Bell, don't forget him) also had to go over, by hand, the names of every star system the game generated, making sure that none of them were rude words. They tried many different algorithms until the game churned out eight galaxies that were actually playable, and they still ended up with a situation where using a galactic hyperspace in the wrong place dumped you on a planet that was out of range of anything else, with no way to ever leave it. (There's a toy sketch of this kind of seed-driven generation at the end of this post.)
For an '80s computer game that is fine: the consequence of that screw-up is that someone has to reload an earlier save and loses an hour's progress in an ultimately inconsequential game.
When you're not dealing with Commander Jameson but with someone's real identity - their online banking, say - you don't do that kind of trickery. Keep it simple, even if doing so makes it bigger and less efficient; then, if something does go wrong, the poor sod who has to fix your mistake has a better chance of figuring out what you were trying to do and fixing it quickly. It'd likely take months of study to fully understand what Braben's code is doing, which is undesirable if your company is offline, or leaking data to Russian hackers, the entire time.
This is why we now have huge managed code frameworks (e.g. dotnet) that some claim are huge and bloated: they stop developers from rolling their own implementations of problems that are already solved and that have security implications if done badly (e.g. date/time functions). It's why OSes are providing more and more services - and some of that bloat is actually security related (e.g. ASLR).
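To make the seed trick concrete: Elite's galaxies weren't stored anywhere - each one was expanded on demand from a few bytes of seed. Below is a toy sketch of that technique in C. It is purely illustrative: the twist function, the digraph table and the starting seed are all invented for this example and bear no resemblance to Bell and Braben's actual code.

    /* Toy sketch of Elite-style procedural generation: expand a whole
       "galaxy" of named star systems from six bytes of seed.
       Everything here is invented for illustration. */
    #include <stdio.h>
    #include <stdint.h>

    /* The entire persistent state for a galaxy: three 16-bit words. */
    struct seed { uint16_t w0, w1, w2; };

    /* Advance the seed: a simple Fibonacci-style mix (hypothetical). */
    static void twist(struct seed *s) {
        uint16_t t = (uint16_t)(s->w0 + s->w1 + s->w2);
        s->w0 = s->w1;
        s->w1 = s->w2;
        s->w2 = t;
    }

    /* Build a pronounceable name from letter pairs chosen by seed bits. */
    static void system_name(struct seed *s, char *out) {
        static const char *digraphs[16] = {
            "le", "xe", "ge", "za", "ce", "bi", "so", "us",
            "es", "ar", "ma", "in", "di", "re", "at", "on",
        };
        int pairs = 3 + (s->w0 & 1);    /* names are 3 or 4 pairs long */
        for (int i = 0; i < pairs; i++) {
            const char *d = digraphs[(s->w2 >> 8) & 15];
            while (*d)
                *out++ = *d++;
            twist(s);                   /* each pair consumes seed state */
        }
        *out = '\0';
    }

    int main(void) {
        struct seed galaxy = { 0x5A4A, 0x0248, 0xB753 }; /* arbitrary */
        char name[16];
        for (int i = 0; i < 8; i++) {   /* first 8 systems of the galaxy */
            system_name(&galaxy, name);
            printf("%s\n", name);
        }
        return 0;
    }

Six bytes of state stand in for what would otherwise be kilobytes of stored names - and, as above, when a rude word or a marooned-player bug falls out of the expansion, there's nothing to patch except the algorithm itself.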
-
andydiysaver wrote: »
I harken back to what Braben did with Elite, fitting that much code into that little memory
I've met Braben a couple of times: nice guy. And he's the first to point out the bugs and difficulty in developing Elite, and the immense amount of time they had to spend on engineering it into the space (ho ho) rather than focusing on the game itself.
You need to make your mind up: are you hankering after simple, clean implementations that can be debugged and reviewed, in which case you want managed frameworks, clean libraries and good observability? Or do you want cryptic, complex code in which every byte has to earn its place twice in order to solve a problem (restricted memory) that doesn't exist any more? I've written in very tight environments, far tighter than the 32K luxury of the BBC Model B: the whole experience is painful and the quality of the code is shocking. I simply don't understand why people would want to return to those days, nor why anyone would think software quality was better.
I recently had cause to read the source code for the 7th Edition implementation of "sort". It uses an emulated polyphase tape sort to order files which are too large to fit into memory (remember, 7th Edition was a swapping kernel which restricted processes to 64KB of text and 64KB of data, while files could be much larger). It's horrific. It's not just that the algorithm is complex, because I'm a big boy and I can cope. But the actual code itself is shocking, using all sorts of nasty tricks to economise on every byte. And as it's completely uncommented, actually understanding how it works is a non-trivial task. As Lum points out, having to analyse how a horror show like that works while your site is under attack is beyond imagining, and the idea that the practices in that sort of code (a response to the difficulties of the era, although the lack of commenting is just macho programming) have a place today is ludicrous.
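Stated plainly, the job that code is doing is simple: sort chunks that fit in memory, spill each sorted run to scratch storage, then merge the runs. Here's a minimal sketch of that in C - an ordinary external merge sort over newline-separated integers, under the assumption of a small fixed memory budget (RUN_LEN), and deliberately not a reconstruction of the polyphase scheme:

    /* Minimal external merge sort sketch: integers on stdin, sorted
       output on stdout. Illustrative, not the 7th Edition algorithm. */
    #include <stdio.h>
    #include <stdlib.h>

    #define RUN_LEN  4096   /* numbers that "fit in memory" at once */
    #define MAX_RUNS 64

    static int cmp(const void *a, const void *b) {
        long x = *(const long *)a, y = *(const long *)b;
        return (x > y) - (x < y);
    }

    /* Phase 1: read RUN_LEN-sized chunks, sort each in memory, and
       write each sorted run to its own temporary file. */
    static int make_runs(FILE *in, FILE *runs[]) {
        static long buf[RUN_LEN];
        int nruns = 0;
        while (nruns < MAX_RUNS) {
            size_t n = 0;
            while (n < RUN_LEN && fscanf(in, "%ld", &buf[n]) == 1)
                n++;
            if (n == 0)
                break;
            qsort(buf, n, sizeof buf[0], cmp);
            runs[nruns] = tmpfile();
            if (!runs[nruns])
                break;          /* out of temp files: sort what we have */
            for (size_t i = 0; i < n; i++)
                fprintf(runs[nruns], "%ld\n", buf[i]);
            rewind(runs[nruns]);
            nruns++;
        }
        return nruns;
    }

    /* Phase 2: k-way merge of the sorted runs; only one number per
       run is held in memory at any moment. */
    static void merge_runs(FILE *runs[], int nruns, FILE *out) {
        long head[MAX_RUNS];
        int live[MAX_RUNS];
        for (int i = 0; i < nruns; i++)
            live[i] = (fscanf(runs[i], "%ld", &head[i]) == 1);
        for (;;) {
            int best = -1;
            for (int i = 0; i < nruns; i++)
                if (live[i] && (best < 0 || head[i] < head[best]))
                    best = i;
            if (best < 0)
                break;
            fprintf(out, "%ld\n", head[best]);
            live[best] = (fscanf(runs[best], "%ld", &head[best]) == 1);
        }
    }

    int main(void) {
        FILE *runs[MAX_RUNS];
        int nruns = make_runs(stdin, runs);
        merge_runs(runs, nruns, stdout);
        return 0;
    }

The polyphase variant earns its complexity only when "scratch storage" means a handful of tape drives; on anything resembling a modern system, the plain version above is the one you'd want to be debugging at 3am.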
In the 1990s I was involved in a project to recompile a Unix distribution (SVR3, I think: we had a licence and everything) with the GNU C compiler, and we found, amongst other nightmares, code that assumed that if you declare two arrays successively and then index off the end of one, you simply get the beginning of the other - and worse, code that actually relied on this (we found it because gcc happens to lay out memory the other way around). It wasn't a bug, it was deliberate. We found code that relied on the layout of the stack frame and broke if the alignment changed. We found code that made hideous assumptions about the order of evaluation of argument lists, which is explicitly undefined in C. That golden age? Carp.
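For anyone who hasn't met these, here are toy reconstructions of the first and last of those assumptions - not the actual SVR3 code, just minimal illustrations of what the C standard gives you no right to rely on:

    /* Illustrative only: both of these compile, and both are broken. */
    #include <stdio.h>

    int a[4];
    int b[4];   /* nothing says b follows a in memory */

    static int next_val(void) { static int n; return ++n; }

    int main(void) {
        /* Assumption 1: indexing off the end of a lands in b. This
           out-of-bounds write is undefined behaviour; the compiler
           and linker may lay the arrays out either way round. */
        a[4] = 42;
        printf("b[0] = %d\n", b[0]);    /* may or may not print 42 */

        /* Assumption 2: argument lists evaluate left to right. C
           leaves the order unspecified, so this can legitimately
           print "1 2" or "2 1" depending on the compiler. */
        printf("%d %d\n", next_val(), next_val());
        return 0;
    }

With one compiler these "work"; with another they quietly don't - which is exactly how the recompile project above flushed them out.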
-
Right, I'll try and be clear.

securityguy wrote: »
You need to make your mind up: are you hankering after simple, clean implementations that can be debugged and reviewed, in which case you want managed frameworks, clean libraries and good observability? Or do you want cryptic, complex code in which every byte has to earn its place twice in order to solve a problem (restricted memory) that doesn't exist any more?
Back in the age of the BBC - pre-XP - you had your tricks because you needed your tricks, due to the hardware limitations. To say I admired the tricks isn't the same thing as saying the tricks are relevant today
...because of course they don't have to be. We've gone from trying to fit a size 30 into a size 8, to trying to fit a size 8 into a size 30 - thinking, ah damn, that didn't work; next size 8; doesn't matter, he'll/she'll/it'll fit; we've got that Jacamo XXXXXL thingy on order - so if plan A doesn't work, plans B, C, D... Z are still OK within roughly the same space.
Also, to clarify my point: I think XP represented a far more sane balance than both those which went before it, and the bloated things that came (and will come) after it.
So my mind is made up - we've gone from one extreme to the other!
-
"I think XP represented a far more sane balance than both those which went before it , and the bloated things that came (and will come )after it"
And perhaps you thought that in 2001, too. But a lot of people didn't, and it's interesting to see that the beloved Windows XP, now held up as the perfect balance, was seen in a very different light when it shipped.
The Observer ("Why we should chuck XP out the Windows"), October 2001 was particularly outraged:So why are all these idiots in computer stores drooling at the prospect? Answer: because Windows XP is a monstrous, bloated brute that requires a state-of-the- art PC and two gigabytes of hard disk space before it will even say 'hello'. This means any consumer foolish enough to want to run XP will probably have to buy a new PC.
At a time when sales have stagnated, this is great news for the hard-pressed computer industry. So trebles all round for the suits in PC World, Currys and the like?
Er, possibly not. For one thing, XP is being launched into a world now sinking into recession, which means corporate IT managers may not take kindly to the notion of having to order hundreds of new PCs simply to run a version of Excel with cooler graphics when their users are perfectly happy with the old, uncool version.
Here's an excellent summary of people claiming that every single release of Windows is bloated and full of unnecessary features.
-
There is a large disconnect between what the industry wants to push on the public and what the public actually needs. You only have to look at the rapidly increasing size of Microsoft products to understand that they are becoming bloated. Windows soon went from floppy disc to CD to DVD, yet functionally you can do more or less the same things on Windows 95 as on Windows 8.
The purpose of an operating system is to run applications!