Software upgrades in general

Software QA is largely a thing of the past. Now it's a rush to release to meet sales/marketing deadlines and have users do the final testing, even for "stable" releases and not just "public betas," which has contributed to the lowering of standards and expectations (thanks, Google!).
What are you basing this opinion on? A few quick searches on LinkedIn turn up thousands of open QA positions.

The problem is poor release planning, rigid methodologies that adhere to near-religious levels of procedural dogma, developers who cut corners, and management that fails to set priorities and over-scopes projects. Deadlines are not inherently a problem; they just appear that way when poorly executing teams fumble across them.

As a user, one can try to file bug reports, but seeing them dismissed or ignored in the tracker doesn't encourage keeping up the effort for anyone but the most dedicated (or those most annoyed by a particular bug).
Generally, if this happens, it's due to at least one of four reasons:

1. No reproducible workflow
2. Specific to that user and not others
3. Impossible to safely fix
4. Other more important bugs are prioritized higher

Most software is mature, so a lot of new features aren't essential, making security the biggest reason to update (outside of a manufacturer's push for new revenue generation through mandatory updates). Many users don't, but the risk/reward ratio still usually works in their favor, as long as they aren't part of a targeted group or engaging in riskier behavior.
Whoa, how did you get this impression? There are thousands of new pieces of software coming out constantly.
 
This has happened a few times with Win11. After an operating system "upgrade," some things no longer work as before. I just walk back the upgrade and continue on. I've had a few OS upgrades since and they've been transparent.

My Win11 PC will not run older software suites unless I upgrade to subscription services. I keep my old Win7 Pro PC for when I need to run software that I already paid for with a perpetual license (specifically Adobe applications).

I have a switch so I can simply push a button and switch between both PCs while using one KB, mouse and monitor.


There's probably a more technically elegant way to do this but it works for me. If I need to move files from one PC to the other, that's what thumb drives are for.
 
This morning, as I was pulling some video from the dash cam's micro SD card, I got a popup: a new version of the VLC media player was available.

To be fair, I'm on Win7 Pro on this older laptop.

I downloaded and installed it, and now it no longer works at all. It crashes every time it's launched.

What was very cool is that they keep their archive of old releases available, so I thought back and figured I got this dash cam around March 2023. I uninstalled the new version, installed the release from that March, and all is good.

Yes, I should have remembered that this machine is running Win7 (I declined a free upgrade on purpose, since I have BMW software installed that does not run on Win10), so I should not have installed it.

But software in general, from what I can see at work, is full of bugs, yet there is always this pressure to upgrade to the latest and greatest. And when we do, problems follow.

I also think developers are under the gun to churn out new and higher version numbers, revisions, etc., and like everything else, the quality is questionable.

Going all the way back to the above: for some reason, I could not play the videos from the dash cam the first time I tried. I searched online, VLC seemed to be a good player to choose, and it worked. I've used it ever since.

Any good/bad software upgrade stories? I did brick a brand-new Zebra ET56 tablet once by shutting it off while it was doing an update; I had to send it back, as it had both warranty and contract coverage...
You are complaining that they did not test a software application on an unsupported OS that is long past end of life? Our attitude is: good luck, and if you want to pay us to support an outdated OS, we are all for it!

I would keep your BMW software as the single use case for that laptop or boot instance. Don't muddle it with other complexity.
 
Too many... we finally had to get dev servers at work. The last update didn't even run; it just sat there and spun for 3 hours. We stopped it and said no to the prod update. Tyco is terrible... perhaps worse than Microsoft. There's lots more I could say, but software will never change.
 
Oh, man, this is too good to be true...

I wrote code for literally 50 years. In all that time, the vast majority of 'my' bugs were caused by 'users' who couldn't walk and chew gum at the same time.

Here's my best example. I wrote a browser app. That's one that runs in an internet browser (Chrome, Firefox, etc.). Most of you should know that the 'address bar' is the area where you key in a web address (bobistheoilguy.com). On apps where names are entered (like mailing addresses for packages you ordered), there will be name entry fields: first name, middle initial, last name, etc. You don't key your name into the 'address bar' at the top of the browser window; you key it into the name fields further down the page. This is obvious, right? Right?!

Not to my two most genius users. They keyed ALL the data into the address bar. I physically showed them, on three different days, what they were doing wrong. A month later, I found out they were entering data into a spreadsheet because they couldn't get the browser app to work. When I see posts on the internet blaming software and software writers, I suspect the issues are caused by the users.
 
What are you basing this opinion on? A few quick searches on LinkedIn turn up thousands of open QA positions.

Oh, I'm sure QA still exists to an extent…just not as much as before.

My first computer was an Apple ][, and some of the things we have now I couldn't even imagine then, but I still do have that perspective. I'm not Joe Blow User like above. I've worked in IT, chosen to participate in pre-release software, and know how to file bugs.

If there's good, bug-free software being written, tested, and released now, it's not in the mainstream.

Perhaps I've turned too curmudgeonly, or just gotten sick of the tech hamster wheel, but per the OP's question, I think very hard before updating and make sure there is a way back if need be. More emojis added to the character set, or other fluff that often passes for new features these days, doesn't count in my book.

Ultimately, these are just tools, and if they work and do what you need them to do, leaving them as they are may not be such a bad idea.
 
Software QA is largely a thing of the past. Now it's a rush to release to meet sales/marketing deadlines and have users do the final testing, even for "stable" releases and not just "public betas," which has contributed to the lowering of standards and expectations (thanks, Google!).

As a user, one can try to file bug reports, but seeing them dismissed or ignored in the tracker doesn't encourage keeping up the effort for anyone but the most dedicated (or those most annoyed by a particular bug).

Most software is mature, so a lot of new features aren't essential, making security the biggest reason to update (outside of a manufacturer's push for new revenue generation through mandatory updates). Many users don't, but the risk/reward ratio still usually works in their favor, as long as they aren't part of a targeted group or engaging in riskier behavior.


I agree with this mostly 100%. I think a lot of stuff released today, not just software, is nothing more than some marketing imbecile justifying their job.

Software updates are tired. Remember when Adobe had a software update every 36 hours?
 
Programs work until they don't. It could be a bug or it could be a business logic change.
Programming is simply translation: taking a business need and turning it into a language the computer can understand.
I love programming, but it is just a tool. The business need being addressed is what is important.

Programs work until they don't.
 
Oh, man, this is too good to be true...

I wrote code for literally 50 years. In all that time, the vast majority of 'my' bugs were caused by 'users' who couldn't walk and chew gum at the same time.

Here's my best example. I wrote a browser app. That's one that runs in an internet browser (Chrome, Firefox, etc.). Most of you should know that the 'address bar' is the area where you key in a web address (bobistheoilguy.com). On apps where names are entered (like mailing addresses for packages you ordered), there will be name entry fields: first name, middle initial, last name, etc. You don't key your name into the 'address bar' at the top of the browser window; you key it into the name fields further down the page. This is obvious, right? Right?!

Not to my two most genius users. They keyed ALL the data into the address bar. I physically showed them, on three different days, what they were doing wrong. A month later, I found out they were entering data into a spreadsheet because they couldn't get the browser app to work. When I see posts on the internet blaming software and software writers, I suspect the issues are caused by the users.
Guessing the data-entry users were using the Tab key to move between fields rather than the mouse, and the tabindex was potentially not set up properly in the HTML? That happens a lot with our stuff, in terms of copying or improper setup.
 
Also, the HTML needs to apply focus to the first input field when the page loads. Then the user can do traditional entry by typing and hitting Tab for the next field. It would take a deliberate action to move back to the address bar.
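For what it's worth, a minimal sketch of what that looks like in the browser, assuming hypothetical field IDs (firstName, middleInitial, lastName) that aren't from the app being discussed:

```typescript
// Minimal sketch: move keyboard focus into the form on page load and set an
// explicit Tab order. The field IDs are hypothetical examples.
window.addEventListener("DOMContentLoaded", () => {
  // Put the cursor in the first name field so typing goes into the form,
  // not the browser's address bar.
  const firstField = document.querySelector<HTMLInputElement>("#firstName");
  firstField?.focus();

  // Explicit tab order; only needed if the DOM order doesn't already match
  // the visual order of the fields.
  ["#firstName", "#middleInitial", "#lastName"].forEach((selector, i) => {
    const el = document.querySelector<HTMLInputElement>(selector);
    if (el) el.tabIndex = i + 1;
  });
});
```

The autofocus attribute on the first input element accomplishes the focus part without any script; the point is just that the page, not the user, should decide where typing starts.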
 
Oh, I'm sure QA still exists to an extent…just not as much as before.
I am really just asking you to explain. You said, "QA is largely a thing of the past" without much reasoning. What are you basing this claim on? There are thousands of open QA positions posted right now. And when was this "before"?

If there's good, bug-free software being written, tested, and released now, it's not in the mainstream.
I so strongly disagree with this I wonder if our definitions are different. How do you define “software”? I define it as code running with some sort of UI to solve a problem for an end user. All the apps on your phone, all the functions of your smart TV, many of the websites you visit, all self checkouts, all self banking, all car infotainment systems, and all the applications still on your desktop qualify as software. Software is a bigger part of our lives than at any point in human history and society literally wouldn’t function if what you’re saying is true.

Honestly I’m flabbergasted anyone could say there’s no good software being released now. We use software all day, every day.

The only way I can wrap my head around your thinking is if you define software narrowly to mean "desktop software," but even then I disagree. No software is bug-free, but some software is more practically functional than others. I would strongly argue that MS Office, for example, is more stable now than it was 15 years ago.

I think very hard before updating and make sure there is a way back if need be. More emojis added to the character set, or other fluff that often passes for new features these days, doesn't count in my book.

Ultimately, these are just tools, and if they work and do what you need them to do, leaving them as they are may not be such a bad idea.
We are on totally different planets. See my previous posts for reasoning.
 
I don't think my employers ever just develop software revs for no reason. Usually we have to make new revisions because:

1. We found problems in the older stuff.
2. The older stuff is obsolete and we need to update to a new standard (no more 2G and 3G cell networks, so we have to go to LTE; older NAND chips are not produced anymore, so we need a new controller and firmware to support new NAND chips).
3. The new stuff is cheaper to produce, can support the higher volume that customers demand, or automates some manual labor away.
4. Customers want new things that are bigger, cooler, faster, etc.

Our tools get updated as needed, but we do use old stuff that's still working well. I have a scope in the lab that has a 3.5" floppy drive attached to it and is running Windows 3.1. It is not connected to the internet, obviously.
 
Our video and VFX workstations here are not on the internet (air gap), so security is not an issue.
When I build a new workstation, I put a hand-massaged Win 10 on it (all the crap and bloat turned off or removed). Once it runs fine, it never gets an OS update. Programs get updates when they provide a significant speed or feature improvement. This has worked pretty well for us for 20 years.
 