Author Topic: [Banter] What is the worst software you have used for its price?  (Read 12454 times)


Online free_electron

  • Super Contributor
  • ***
  • Posts: 8517
  • Country: us
    • SiliconValleyGarage
Re: [Banter] What is the worst software you have used for its price?
« Reply #75 on: May 22, 2022, 08:11:05 pm »
I did something similar at lecroy booths. take the timebase knob, spin it fast left, fast right, fast left and repeat that like 10 times. Then walk away while the scope goes nuts zooming in and out and displaying that dreaded "triggering..." with the slowly crawling progress bar at the bottom.
Ah, the good ol' "let's do computation in the UI event handler" antipattern.  Causes the events to be queued, slowing down everything. Drives me crazy.

This is not new, and was well known and common in the 8-bit era already, especially games.  It saddens me to hear that even LeCroy fell into this easily avoided UI trap...
The problem is most likely much deeper and may even be intentional. What I have seen on Windows based equipment from several brands is that the CPU load is low even when the equipment is doing heavy processing tasks (like math or analysis). It looks more like they are trying to keep the CPU cool on purpose. The slow UI response can also be a hold-off mechanism.
if i take my old hp infiniium running a 150MHz pentium 1 and windows 95 and 4 meg of ram and spin that timebase button the screen refresh is instantaneous , like an analog scope.
Those lecroy machines have windows 7, 8-core cpus with hyperthreading and 64 gigs of ram. You give the timebase knob a couple of fast spins and you will hear the cpu fan go into overload and the thing crawl slower than a drunk snail. Those machines are not oscilloscopes; they are very fast digitizers and the rest is done on the pc. Rohde & Schwarz, Tek and Hewlettagilentkeypackardsight do all that stuff in custom hardware. The OS is only there for the GUI: well-known platforms with a friendly face for the user.
Professional Electron Wrangler.
Any comments, or points of view expressed, are my own and not endorsed , induced or compensated by my employer(s).
 
The following users thanked this post: rsjsouza

Offline tom66

  • Super Contributor
  • ***
  • Posts: 6689
  • Country: gb
  • Electronics Hobbyist & FPGA/Embedded Systems EE
Re: [Banter] What is the worst software you have used for its price?
« Reply #76 on: May 22, 2022, 08:14:34 pm »
if i take my old hp infiniium running a 150MHz pentium 1 and windows 95 and 4 meg of ram and spin that timebase button the screen refresh is instantaneous , like an analog scope.
Those lecroy machines have windows 7, 8-core cpus with hyperthreading and 64 gigs of ram. You give the timebase knob a couple of fast spins and you will hear the cpu fan go into overload and the thing crawl slower than a drunk snail. Those machines are not oscilloscopes; they are very fast digitizers and the rest is done on the pc. Rohde & Schwarz, Tek and Hewlettagilentkeypackardsight do all that stuff in custom hardware. The OS is only there for the GUI: well-known platforms with a friendly face for the user.

It's just bad software design.

The Rigol scopes get a lot of flak for buggy software, but their DS1000Z series, for instance, has a single-core 600MHz ARM CPU running the show.  The UI is snappy and responsive: not quite analog-scope speed, but not far behind.  A scope doesn't need a fast CPU if the software is designed well.  Heck, even the first-gen iPhone had a similar processor, and it handled animations and touchscreen pinch-and-zoom while feeling responsive throughout.

We trialed a $20k Tek scope for a while and the interface would periodically freeze for 1-2 seconds while the scope was doing...something in the background.  That's just unacceptable; the scope shouldn't do that.  It speaks to poor software design: a lack of parallelism and prioritization.  Anything taking more than about 100ms immediately feels 'laggy' to users.  It wasn't the only reason we discarded that scope, but it was a significant one.
 
The following users thanked this post: rsjsouza, MK14, bd139

Offline T3sl4co1l

  • Super Contributor
  • ***
  • Posts: 21657
  • Country: us
  • Expert, Analog Electronics, PCB Layout, EMC
    • Seven Transistor Labs
Re: [Banter] What is the worst software you have used for its price?
« Reply #77 on: May 23, 2022, 12:38:54 am »
Paintshop pro after it was borged by Corel. Installer became crap , spyware , bloatware , now totally unusable. Same happened to Microfx Designer. Corel borged it and ran it into ground.

I still use PSP7 regularly... very lightweight, still easier to use than GIMP, and clearly made with low RAM and slow CPUs in mind; feels like such a flex to preview a Gaussian filter on a 10Mpx image. ;D

Tim
Seven Transistor Labs, LLC
Electronic design, from concept to prototype.
Bringing a project to life?  Send me a message!
 
The following users thanked this post: free_electron, HighVoltage

Offline T3sl4co1l

  • Super Contributor
  • ***
  • Posts: 21657
  • Country: us
  • Expert, Analog Electronics, PCB Layout, EMC
    • Seven Transistor Labs
Re: [Banter] What is the worst software you have used for its price?
« Reply #78 on: May 23, 2022, 12:51:46 am »
I did something similar at lecroy booths. take the timebase knob, spin it fast left, fast right, fast left and repeat that like 10 times. Then walk away while the scope goes nuts zooming in and out and displaying that dreaded "triggering..." with the slowly crawling progress bar at the bottom.
Ah, the good ol' "let's do computation in the UI event handler" antipattern.  Causes the events to be queued, slowing down everything. Drives me crazy.

Even my Tek TDS460 has -- besides the annoyingly slow-to-respond menus -- an occasional bug which, I suspect, goes something like this: user input is interrupt-triggered, and the front-panel encoders sometimes go crunchy.  So just sliding a cursor around can freeze the UI, requiring a power cycle.  (Fortunately the power button -- which is itself a soft button -- does not go through the same front-panel interface, so the scope doesn't have to be re-plugged.)

HP/Agilent 100% made the best 80s-00s equipment.  I'd take an HP54xxx over anything TDS -- the 460 just happened to be available last time I went shopping.  Which, to be clear, has been quite a while.  I've certainly gotten more than my money's worth from it.

But as for GUI programs, the sheer number of them that handle input or computation in the message thread... yeah, absurd.  No excuses. :palm:


We trialed a $20k Tek scope for a while and the interface would periodically freeze for 1-2 seconds while the scope was doing...something in the background.  That's just unacceptable; the scope shouldn't do that.  It speaks to poor software design: a lack of parallelism and prioritization.  Anything taking more than about 100ms immediately feels 'laggy' to users.  It wasn't the only reason we discarded that scope, but it was a significant one.

Was this recently?

Like, GUIs aren't even the hard part... you've got gigasamples in the ass end of the thing, just... ask someone once in a while if it feels good to use?!  Bring in a consultant FFS just to check the UI, accessibility features (don't forget colorblindness!), etc.  Even a graphic designer, maybe, to make it nice and shiny and responsive.  These are basic things; like, two-bit web designers handle this stuff...

Tim
« Last Edit: May 23, 2022, 12:56:13 am by T3sl4co1l »
Seven Transistor Labs, LLC
Electronic design, from concept to prototype.
Bringing a project to life?  Send me a message!
 

Online free_electron

  • Super Contributor
  • ***
  • Posts: 8517
  • Country: us
    • SiliconValleyGarage
Re: [Banter] What is the worst software you have used for its price?
« Reply #79 on: May 23, 2022, 01:05:47 am »
Paintshop pro after it was borged by Corel. Installer became crap , spyware , bloatware , now totally unusable. Same happened to Microfx Designer. Corel borged it and ran it into ground.

I still use PSP7 regularly... very lightweight, still easier to use than GIMP, and clearly made with low RAM and slow CPUs in mind; feels like such a flex to preview a Gaussian filter on a 10Mpx image. ;D

Tim
i also still use that one. i think 9 was the last version before corel .  and the companion program "aftershot". There's another tool from back then : ACDsee
Professional Electron Wrangler.
Any comments, or points of view expressed, are my own and not endorsed , induced or compensated by my employer(s).
 

Offline ebclr

  • Super Contributor
  • ***
  • Posts: 2328
  • Country: 00
Re: [Banter] What is the worst software you have used for its price?
« Reply #80 on: May 23, 2022, 02:21:31 am »
KiCad, for sure: it has the worst interface of any software I have used, totally unintuitive.
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6231
  • Country: fi
    • My home page and email address
Re: [Banter] What is the worst software you have used for its price?
« Reply #81 on: May 23, 2022, 03:06:07 am »
Even my Tek TDS460 has -- besides the annoyingly slow-to-respond menus -- an occasional bug which, I suspect, goes something like this: user input is interrupt-triggered, and the front-panel encoders sometimes go crunchy.  So just sliding a cursor around can freeze the UI, requiring a power cycle.
It may not be exactly the same thing, but the underlying problem – coupling each UI state change to a particular update – is the same.

Even in the example of finite impulse response filter analysis as a standalone HTML page, I separated "calculation" from "redraw".  Granted, because the redraw was fast enough even on the slowest machine (with a modern browser; mainly Canvas support), I just chained them together.  If it had been a problem, all I'd need is to add a global variable representing the current redraw timeout.  Whenever recalculate is triggered, it would first disable any existing redraws, and then set a new timeout after calculation completes.
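The cancel-and-reschedule idea above translates directly to other toolkits; here is a minimal Python sketch (the class and method names are invented for illustration) where each new recalculation cancels any pending redraw and schedules a fresh one:

```python
import threading

class RedrawScheduler:
    """Coalesce redraw requests: each new request cancels the
    pending one, so only the last within the window fires."""

    def __init__(self, redraw, delay=0.05):
        self._redraw = redraw   # the actual (slow) draw function
        self._delay = delay     # debounce window, in seconds
        self._timer = None
        self._lock = threading.Lock()

    def request(self):
        with self._lock:
            if self._timer is not None:
                self._timer.cancel()   # disable the existing redraw
            self._timer = threading.Timer(self._delay, self._redraw)
            self._timer.start()
```

Twirl the knob a hundred times and the draw function still runs once, after the last change.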

I've used Gtk+ and Qt widget toolkits extensively.  (Gtk+ is pure C, Qt is C++.)  They both are based on an event loop under the toolkit control, so require an event-driven programming approach.  Neither works well with major computation done in the same thread as the UI event loop, and this is typical for all UI toolkits and approaches.  Even so, that –– do everything within the UI event loop; perhaps in the idle handler –– is what the widget toolkit documentation recommends.  Stupid.  You do not even need multiple hardware threads!  All you need is a way to interrupt or time-slice the major computation.  The simplest implementation only requires a timer interrupt, and switching the processor state and stack between the different concurrent tasks.
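Time-slicing of the kind described needs nothing more than a resumable task; a rough Python sketch, with made-up names, using a generator as the interruption point:

```python
def timesliced(work_items, process, budget=1000):
    """Process work in bounded slices; each yield hands control
    back to the event loop so pending input events can run."""
    for start in range(0, len(work_items), budget):
        for item in work_items[start:start + budget]:
            process(item)
        yield  # cooperative scheduling point

def run_loop(task, handle_events):
    """Toy event loop: alternate one computation slice with
    input handling until the task is exhausted."""
    for _ in task:
        handle_events()
```

The same shape works inside a real toolkit's idle handler: do one bounded slice, then return control.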

One of the real tricks is having a way to cancel and/or restart the heavy calculation.  (An atomic flag and an early return from the computation function works well.)  In an oscilloscope, this is not really an option, because –– assuming I have the correct picture of how they work –– the "heavy computation" is actually communication with the dedicated capture hardware, and setting that up.  I can well imagine that a hardware FPGA/ASIC designer without any user interface experience would design this communication to be a full setup information package, instead of a per-variable/feature one, because the former is just so much simpler and more robust.  But, it also means that the UI must be very careful as to when it decides to send such setup packages: delay too long, and the UI will be sluggish.  Queue the changes, and you get the "twirl-a-knob-and-it-will-freeze".
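A minimal sketch of the atomic-flag-and-early-return trick, in Python (`threading.Event` standing in for the flag; the check interval of 1024 samples is arbitrary):

```python
import threading

cancel = threading.Event()  # the "atomic flag"

def heavy_computation(data):
    """Check the flag every so often and return early, so a
    superseding request can abandon this run cheaply."""
    total = 0
    for i, x in enumerate(data):
        if i % 1024 == 0 and cancel.is_set():
            return None  # abandoned; the caller restarts with new settings
        total += x
    return total
```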

No, the control messages need to be categorized by the latencies of their effects, for best results.  Something like changing the trigger level should be basically instant.  Something that causes e.g. a relay to change states will take human-scale noticeable time (a fraction of a second), and therefore must override any of the faster changes.  The UI side needs a configuration state machine that is aware of these latencies, so that it does not bother to e.g. set the input scale if it knows there is already a different scale selected in the UI.  And yes, you indeed want to do the (necessary) highest-latency changes first: I'll leave it as an exercise to realize why.
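That latency-aware filtering might be sketched like this (the setting names and latency figures are invented for illustration, not taken from any real instrument): drop changes that already match the hardware state, then dispatch the slowest remaining ones first.

```python
# Hypothetical latency classes, in seconds, for a few scope settings.
LATENCY = {"input_relay": 0.3, "attenuator": 0.05, "trigger_level": 0.0}

def plan_changes(current, desired):
    """Drop changes already matching the hardware state, then
    order the rest so the slowest (e.g. relay clicks) start first."""
    changes = {k: v for k, v in desired.items() if current.get(k) != v}
    return sorted(changes.items(), key=lambda kv: -LATENCY[kv[0]])
```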

Simply put, you need an intermediary state machine or filter between the user events and the capture engine, to minimize the latency between UI changes and their visible effects.  For hardcore FPGA/ASIC designers who disdain anything human-scale, this is "ugly" and "silly"; they feel it is an intrusion into their domain.

Similarly, if you write a user interface to communicate with, say, a USB, serial, or Ethernet-connected device, you need to split the communication into a separate thread (often a state machine), and use a thread-safe queue or similar to pass commands/requests/state changes and results between the two.  Either side also needs that small state-change optimizer machine, so that changes are not queued, but combined for efficiency.  Hell, even if you just write a crude tool to display the microphone signal level at the edge of your display, you'll want to use a separate thread for that, so that UI hiccups do not affect the audio stream and vice versa.
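A toy version of such a coalescing, thread-safe mailbox (hypothetical class name; latest value wins per setting, so changes are combined rather than queued):

```python
import threading

class CoalescingQueue:
    """Latest-value-wins mailbox keyed by setting name, so a
    hundred rapid knob clicks collapse into one pending value."""

    def __init__(self):
        self._pending = {}
        self._cond = threading.Condition()

    def put(self, key, value):
        # UI-thread side: overwrite any queued value for this key.
        with self._cond:
            self._pending[key] = value
            self._cond.notify()

    def drain(self, timeout=None):
        # Device-thread side: take everything pending as one batch.
        with self._cond:
            if not self._pending:
                self._cond.wait(timeout)
            batch, self._pending = self._pending, {}
            return batch
```

The device thread drains whole batches, so it never falls behind a fast-spinning knob.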
For most programmers, this is hard.  It is already hard to switch between event-driven and state machine logic; but doing so just to get a responsive UI is not worth the effort to most –– even if they are paid to do just that.

I definitely blame the developers and programmers for these kinds of issues.  They are solvable, and the solutions are well known.  The fuckers are just too lazy or inept to do the work right.  (Then again, the one complaint I always got when I wrote software for a living was "it does not need to be perfect; it just needs to look like it works.  We can fix the issues later, when we have time".  So maybe I'm expecting too much from people.)
« Last Edit: May 23, 2022, 03:07:55 am by Nominal Animal »
 
The following users thanked this post: rsjsouza

Online free_electron

  • Super Contributor
  • ***
  • Posts: 8517
  • Country: us
    • SiliconValleyGarage
Re: [Banter] What is the worst software you have used for its price?
« Reply #82 on: May 23, 2022, 04:49:46 am »
One of the real tricks is having a way to cancel and/or restart the heavy calculation.  (An atomic flag and an early return from the computation function works well.)  In an oscilloscope, this is not really an option, because –– assuming I have the correct picture of how they work –– the "heavy computation" is actually communication with the dedicated capture hardware, and setting that up.  I can well imagine that a hardware FPGA/ASIC designer without any user interface experience would design this communication to be a full setup information package, instead of a per-variable/feature one, because the former is just so much simpler and robust.  But, it also means that the UI must be very careful as to when it decides to send such setup packages: delay too long, and the UI will be sluggish.  Queue the changes, and you get the "twirl-a-knob-and-it-will-freeze".
no. the problem is that they do everything in software on the pc side. They copy a bunch of samples from the acquisition board and then do a bunch of data crunching and visualise it.

For example, peak detect: you have to search the array for the minimum and maximum value. If you just did that on a million samples, it takes some time... If it is done in hardware it takes zero time, as it happens while the data is being acquired: two digital comparators looking at what comes out of the ADC before it even goes into memory. Once the memory is full you have your min and max right there; no need to go over the data once more.

Another gripe of mine: aliasing on a scope screen. There are only so many pixels horizontally, so how do you cram 1 million samples onto a 1024x800 screen? Do you take only one sample in every 1000? Do you draw a line to the next one? We all know there is a risk that creates an aliased image. The correct way to do this is to take a block of 1000 samples, find the minimum and maximum in that block, and draw a vertical line (not a line to the next "sample") from min to max, then process the next block. Your horizontal step is simply a counter for the block being processed, while the vertical is a line from min to max. Bye bye aliasing. The problem: that takes a lot of CPU cycles. In hardware? It can be done as the data is being acquired: a digital comparator finds min and max, and at the end of every 1000 samples stores them. So you get a secondary array containing the actual stuff that needs visualisation, and you feed that data straight into a hardware overlay. That's how the Infiniiums did it: the scope hardware writes directly to video memory, bypassing the entire PC and operating system. If you did an alt-printscreen you got a nice picture of all the Windows icons and menus, but the actual scope grid was a black canvas (actually a specific RGB value: the hardware knew it should overlay anything having that specific RGB mark. Think of a green screen like for TV newsrooms or weather reports: anything with that specific RGB is replaced by the hardware overlay).
If you altered the timebase post-capture, they simply instructed the hardware to rescan its memory and build a new min/max array (those scopes have 800x600 LCD monitors, so you only need around 512 pairs horizontally...). Since the acquisition hardware can cycle the memory at 4GHz (the scope can do 4GS/s, so it has the memory speed), crunching 4 megasamples takes 1 millisecond to find the min/max pairs. The LCD refresh rate is 100Hz... meaning this new visualisation happens much faster than even the monitor can follow.
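The block min/max scheme described above is easy to state in software terms; a small Python sketch of the same decimation the hardware comparators do on the fly (names are illustrative):

```python
def minmax_decimate(samples, columns):
    """One (min, max) pair per screen column; drawing each pair as
    a vertical line keeps narrow glitches visible instead of
    aliasing them away."""
    block = len(samples) // columns
    return [(min(samples[c * block:(c + 1) * block]),
             max(samples[c * block:(c + 1) * block]))
            for c in range(columns)]
```

A one-sample glitch survives the decimation, which naive take-every-Nth subsampling would miss.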

On the newfangled machines they first need to pump all the data from the acquisition memory over a relatively slow bus (much slower than the acquisition memory) into the PC memory, and then a bunch of algorithms have to run sequentially and iteratively over the data to find what they need. In many scopes these days the acquisition memory is much, much larger than the PC memory, so there is no way to pump all the data over (with a DMA-like mechanism, for example); the PC side simply doesn't have enough storage space. Storing it on SSD or spinning rust would be an even slower nightmare... so the PC side must access that memory through a bus that is much slower than its access to its own memory.

That's where the bottleneck lies: a big dataset that needs moving, or accessing through a bus much slower than the acquisition or main memory, and then you need to unleash iterative and sequential operations on it to find what is of interest.
Shoving that task onto the acquisition hardware is the correct solution: that memory and logic is much, much faster than the PC will ever be, since it runs at acquisition speed.

That's why all those cheap scopes use a simple ARM processor and a big fat FPGA hooked into the acquisition memory. They don't copy data or move it or dig into it from the ARM side; the ARM tells the hardware "build me an array of screen size with this kind of information". The hardware does that before the current LCD redraw cycle has even completed (in the old days of vacuum balloons they did it during the vertical flyback).

Look at those older 54645D oscilloscopes. They have 4 megasamples of memory; 16 megabytes total (8 meg for the two analog channels, 4 each, and 8 meg for the 16 digital channels). That thing is run by a Motorola 68000 clocked at 8MHz...
Its display is a picture tube with 600x400 resolution. You can twirl that timebase knob as hard as you want; the screen refresh is instantaneous, without flicker, lag or aliasing.
It is not possible to do that in software. That 68K has a 16-bit data bus, so even if you were to pair the memories you would need to move 8 megawords of data at 8MHz; that alone would take one second, and you haven't done anything yet: counting and doing the compares to find min and max on each sample. Let's assume you need 10 instructions per sample; you are looking at 10 seconds for a screen redraw. And there are tons of other things to do: scan the keyboard, the encoders, the GPIB, maybe run an FFT to show the spectrum of the analog channels. How about doing bus decoding or pattern matching on the logic data?

The acquisition system runs at a 100MHz clock... Find the min/max pairs on a 4-meg-deep block? It can do that 25 times a second! (It has parallel access to all the data.)

« Last Edit: May 23, 2022, 05:01:17 am by free_electron »
Professional Electron Wrangler.
Any comments, or points of view expressed, are my own and not endorsed , induced or compensated by my employer(s).
 
The following users thanked this post: tom66, Nominal Animal

Offline eti

  • Super Contributor
  • ***
  • !
  • Posts: 1801
  • Country: gb
  • MOD: a.k.a Unlokia, glossywhite, iamwhoiam etc
Re: [Banter] What is the worst software you have used for its price?
« Reply #83 on: May 23, 2022, 07:13:22 am »
Linux is simultaneously a good and bad thing. It's as good as the price we pay, because the "support" is "piss off, you should know this, we learnt and so now must you, and learn all the new acronyms and syntax which some autistic 'community' assumes you knew from birth, and we know you have a busy life, but spend a month trawling sourceforge, then compile... rinse and repeat"

I know the benefits and pitfalls of ALL mainstream Mac/Win/Lin OS', and use them with caution and wisdom.

PS: I am autistic; we aren't ALL oblivious to HUMANS being the ones using products, and needing clear, simple guides. The Linux 'community' is the reason it's not a full-blown, worldwide commercial phenomenon: aside from a trillion conflicting variants, there's a CLEAR LACK of the ability to understand what ACTUAL 9-5 humans want, need and use. As for all the "open source is magic" mindset - utter tripe - if it WORKS and I can PAY ££ FOR SUPPORT, and not pay with my valuable time chasing my tail and tearing my hair out, I'll gladly line your pockets, screw "open source" - it's an ego massage.
« Last Edit: May 23, 2022, 07:18:48 am by eti »
 

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6231
  • Country: fi
    • My home page and email address
Re: [Banter] What is the worst software you have used for its price?
« Reply #84 on: May 23, 2022, 08:02:14 am »
It is not possible to do that with software.
Not with dumb software and dumb data buffering schemes, no..

But let's say you have an 8-bit ADC and 64-byte cachelines, and as you receive the data, you construct a parallel lookup of min-max values, filling another cacheline per 32 cachelines (2048 samples).  You've now dropped the memory bandwidth required to find min-max for any range to 1/32nd, except that the start and end points have a granularity of 64 samples.  (So do those cachelines separately, I guess.)
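A sketch of that two-level lookup in Python (block size and function names are arbitrary): fully covered blocks are answered from the precomputed summary, and only the ragged edges touch the raw samples.

```python
def build_summary(samples, block=2048):
    """Per-block (min, max) table, built once as the data arrives."""
    return [(min(samples[i:i + block]), max(samples[i:i + block]))
            for i in range(0, len(samples), block)]

def range_minmax(samples, summary, lo, hi, block=2048):
    """Min/max over samples[lo:hi]: whole blocks come from the
    summary; partial blocks at the edges read the raw samples."""
    first = -(-lo // block)   # first block fully inside [lo, hi)
    last = hi // block        # one past the last fully covered block
    if first >= last:         # range lies within a single block
        chunk = samples[lo:hi]
        return (min(chunk), max(chunk))
    candidates = list(summary[first:last])
    edge = samples[lo:first * block] + samples[last * block:hi]
    if edge:
        candidates.append((min(edge), max(edge)))
    return (min(c[0] for c in candidates), max(c[1] for c in candidates))
```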

Similarly, if you can reorder the received data so that you get the cachelines across waveforms, you can construct the display from left to right and use all sorts of clever spanning techniques.  Even antialiased lines boil down to lots and lots of additions, and a few not-too-large lookup tables (that depend on the time base and such).

Using an ARM or Intel/AMD core for that kind of stupid work makes no sense.  The cores are slow at that sort of stuff, and you're paying for nothing there.  Instead, stick a DSP or similar between the acquire buffer and the UI processor, so that the UI processor computes and sets the lookup tables and memory transfers, and the DSP just spits out intensity slices (say, 5-bit full-height pixel columns) that the UI processor then just composes into the display.

To do this sort of stuff right, one must think of the data flow.  A very similar thing really bugs me with most simulator software running on HPC clusters: they calculate, then communicate, then calculate, then communicate, and so on, instead of doing them both at the same time.  Why?  Because it is hard to think of what data needs to be transferred after the next step, when the next step is yet to be calculated.  The data does need to be present before the next time step is calculated, so essentially your data transfers need to be at least one step ahead, and that means predictive and/or heuristic transfers without false negatives (you can transfer extra, but you need to transfer all that are needed), node load balancing, and so on...  Just too hard for programmers who can always just tell professors to buy more and newer hardware.

Linux is simultaneously a good and bad thing. It's as good as the price we pay, because the "support" is "piss off, you should know this, we learnt and so now must you, and learn all the new acronyms and syntax which some autistic 'community' assumes you knew from birth, and we know you have a busy life, but spend a month trawling sourceforge, then compile... rinse and repeat"
No, that's not it.

For open source communities, end users are a net negative: a cost, not a benefit.  Only those who contribute back, somehow, are worth the effort of helping.  What "actual 9-5 humans want, need and use" is absolutely, completely irrelevant.  This is why Linux greybeards laugh at you when you say something like "you need to do X so that Linux can become as popular as Y".  It is as silly to us as Insta-gran and Fakebook "influencers" demanding free food and accommodation.

As to why paid Linux end-user support is relatively hard to find, I think it is because getting such a commercial venture going is highly risky.  It is relatively simple to set up Linux user support in an organization, but as a commercial service, you have huge risks from customers who vent their disappointment at Linux not being a drop-in Windows replacement at you, ruining your reputation at the same time.  The risks aren't worth the gains.
I mean, I consider you, eti, a professional person.  But I for sure would not like to put anyone under your ire at Linux and open source.  The £20 or so an hour you'd be willing to pay would not be worth it.

Perhaps it is time to just admit that Linux and open source is not for you.  And that's fine; it's not supposed to be for everyone, it's just a tool among others.
 

Offline eti

  • Super Contributor
  • ***
  • !
  • Posts: 1801
  • Country: gb
  • MOD: a.k.a Unlokia, glossywhite, iamwhoiam etc
Re: [Banter] What is the worst software you have used for its price?
« Reply #85 on: May 23, 2022, 08:49:12 am »
It is not possible to do that with software.
Not with dumb software and dumb data buffering schemes, no..

But let's say you have an 8-bit ADC and 64-byte cachelines, and as you receive the data, you construct a parallel lookup of min-max values, filling another cacheline per 32 cachelines (2048 samples).  You've now dropped the memory bandwidth required to find min-max for any range to 1/32nd, except that the start and end points have a granularity of 64 samples.  (So do those cachelines separately, I guess.)

Similarly, if you can reorder the received data so that you get the cachelines across waveforms, you can construct the display from left to right and use all sorts of clever spanning techniques.  Even antialiased lines boil down to lots and lots of additions, and a few not-too-large lookup tables (that depend on the time base and such).

Using an ARM or Intel/AMD core for that kind of stupid work makes no sense.  The cores are slow at that sort of stuff, and you're paying for nothing there.  Instead, stick a DSP or similar between the acquire buffer and the UI processor, so that the UI processor computes and sets the lookup tables and memory transfers, and the DSP just spits out intensity slices (say, 5-bit full-height pixel columns) that the UI processor then just composes into the display.

To do this sort of stuff right, one must think of the data flow.  A very similar thing really bugs me with most simulator software running on HPC clusters: they calculate, then communicate, then calculate, then communicate, and so on, instead of doing them both at the same time.  Why?  Because it is hard to think of what data needs to be transferred after the next step, when the next step is yet to be calculated.  The data does need to be present before the next time step is calculated, so essentially your data transfers need to be at least one step ahead, and that means predictive and/or heuristic transfers without false negatives (you can transfer extra, but you need to transfer all that are needed), node load balancing, and so on...  Just too hard for programmers who can always just tell professors to buy more and newer hardware.

Linux is simultaneously a good and bad thing. It's as good as the price we pay, because the "support" is "piss off, you should know this, we learnt and so now must you, and learn all the new acronyms and syntax which some autistic 'community' assumes you knew from birth, and we know you have a busy life, but spend a month trawling sourceforge, then compile... rinse and repeat"
No, that's not it.

For open source communities, end users are a net negative: a cost, not a benefit.  Only those who contribute back, somehow, are worth the effort of helping.  What "actual 9-5 humans want, need and use" is absolutely, completely irrelevant.  This is why Linux greybeards laugh at you when you say something like "you need to do X so that Linux can become as popular as Y".  It is as silly to us as Insta-gran and Fakebook "influencers" demanding free food and accommodation.

As to why paid Linux end-user support is relatively hard to find, I think it is because getting such a commercial venture going is highly risky.  It is relatively simple to set up Linux user support in an organization, but as a commercial service, you have huge risks from customers who vent their disappointment at Linux not being a drop-in Windows replacement at you, ruining your reputation at the same time.  The risks aren't worth the gains.
I mean, I consider you, eti, a professional person.  But I for sure would not like to put anyone under your ire at Linux and open source.  The £20 or so an hour you'd be willing to pay would not be worth it.

Perhaps it is time to just admit that Linux and open source is not for you.  And that's fine; it's not supposed to be for everyone, it's just a tool among others.

“Not for you”? Lol. I’ve been using it as a seasoned pro since 2004. That’s the common mistake of assuming you know someone online.

The issue with Linux is not so much Linux as the arrogance of the obsessives and how they decry “evil” (read: hugely hard-working, clever and deservedly successful) Microsoft etc. Sour grapes sure make a lot of whine.

Linux fans whine about the fact that the hardware that is designed and made for a profitable market, IE the gargantuan and profitable desktop and server market, isn’t able to run perfectly on Linux etc etc. Hey guys, make your own hardware if you’re that upset (hang on, that would require a large paying user base and parts market that form around it due to it being dominant and used EVERYWHERE FOR DECADES)

Whiners whine. I’ve heard (and stupidly taken part in) every conceivable, predictable pro-Linux debate ever online, and the same junk goes round and round for years. Windows pays bills; work involving Windows pays bills. Work done on Macs pays bills. Servers running Linux pay huge bills too. Desktop Linux is what’s left at the end of the meal. That’s how it panned out.

If they want to be successful TRULY, then it’s time to walk out of the pity party, go home and  put on their suits and go do some selling, never mind everyone else. Linux people love to evangelise and criticise. That massages egos but doesn’t pay well.
« Last Edit: May 23, 2022, 08:59:28 am by eti »
 
The following users thanked this post: free_electron

Offline Kjelt

  • Super Contributor
  • ***
  • Posts: 6460
  • Country: nl
Re: [Banter] What is the worst software you have used for its price?
« Reply #86 on: May 23, 2022, 08:56:39 am »
Siemens Teamcenter
Clearcase
Both are bureaucratic bloatware programs that should only be sold and used in North Korea.

Eagle, although I use it. Who TF comes up with the stupid idea that copying some symbols from one schematic to another requires the user to manually type CUT (while it is a copy and not a cut), go to the other schematic, and type PASTE? It is a GUI; a right mouse click should suffice for this... unbelievable.
 
The following users thanked this post: bd139

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 26878
  • Country: nl
    • NCT Developments
Re: [Banter] What is the worst software you have used for its price?
« Reply #87 on: May 23, 2022, 09:09:05 am »
It is not possible to do that with software.
Not with dumb software and dumb data buffering schemes, no..

But let's say you have an 8-bit ADC and 64-byte cachelines, and as you receive the data, you construct a parallel lookup of min-max values, filling another cacheline per 32 cachelines (2048 samples).  You've now dropped the memory bandwidth required to find the min-max for any range to 1/32nd, except that the start and end points have a granularity of 64 samples.  (So do those cachelines separately, I guess.)

Similarly, if you can reorder the received data so that you get the cachelines across waveforms, you can construct the display from left to right and use all sorts of clever spanning techniques.  Even antialiased lines boil down to lots and lots of additions, and a few not-too-large lookup tables (that depend on the time base and such).
The reality is that you can't do it in hardware either (until recently; GPUs are becoming more mainstream in embedded systems). Think about going through >100MB of data and processing it in a timely manner. So clever subsampling techniques are used to create an image that represents the minimum / maximum, while keeping some aliasing on purpose to indicate there is an anomaly in the signal. After all, an oscilloscope is intended to provide meaningful information about a signal even if the individual periods cannot be shown. A simple test you can do is to acquire a frequency sweep of a reasonably high frequency with a small span. This will reveal the subsampling.
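The min/max subsampling described above is easy to sketch in a few lines. The following is a toy illustration of the general technique, not any vendor's actual display pipeline; the bucket size of 64 is an arbitrary choice for the example:

```python
import numpy as np

def minmax_decimate(samples: np.ndarray, bucket: int = 64):
    """Reduce a long sample record to per-bucket (min, max) pairs.

    Drawing one vertical line from min to max per screen column
    preserves the signal envelope (including visible aliasing
    artefacts) while touching far fewer values than plotting
    every sample.
    """
    n = (len(samples) // bucket) * bucket        # drop the ragged tail
    blocks = samples[:n].reshape(-1, bucket)     # one row per bucket
    return blocks.min(axis=1), blocks.max(axis=1)

# One million samples of a 5 kHz sine reduced to 15625 (min, max) pairs:
t = np.linspace(0, 1, 1_000_000)
lo, hi = minmax_decimate(np.sin(2 * np.pi * 5000 * t))
```

The decimated envelope can then be redrawn at any zoom level far faster than the raw record, which is exactly why a slow UI on top of this data model is so inexcusable.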
« Last Edit: May 23, 2022, 09:11:09 am by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline HighVoltage

  • Super Contributor
  • ***
  • Posts: 5468
  • Country: de
Re: [Banter] What is the worst software you have used for its price?
« Reply #88 on: May 23, 2022, 09:25:40 am »
m "aftershot". There's another tool from back then : ACDsee

Don't laugh....
I am still on ACDSee Photo Manager 2009.
It is fast and quick in all aspects.
Every newer version I tried just sucked.

 

There are 3 kinds of people in this world, those who can count and those who can not.
 
The following users thanked this post: free_electron

Offline zzattack

  • Regular Contributor
  • *
  • Posts: 126
  • Country: nl
Re: [Banter] What is the worst software you have used for its price?
« Reply #89 on: May 23, 2022, 10:39:17 am »
Embarcadero C++Builder. Their raison d'être has to be companies that are overcommitted to their current software base. They do have excellent marketing, and there was a time when their VCL (from Delphi) offered advantages over competitors.

While they claim to be the premier platform for rapid application development on Windows/Android/iOS/whatever-the-hell-else, allow me to just briefly highlight some of its very basic shortcomings:

  - frequent compiler bugs
  - longstanding STL issues
  - code editor is super annoying with: 
    * undo/redo buffer corrupting frequently
    * inability to set custom hotkeys
    * inability to disable cursor-past-end-of-line
    * no block-mode editing
    * ctrl+arrow key navigation skips over nearly all common code symbols
  - opening a file from the project browser opens that file about 50% of the time; the other 50% of the time, a seemingly random window/tab receives focus instead
  - terrible UI editor
    * no undo AT ALL
    * everything visual studio and Qt do right, Embarcadero does wrong
    * inheritance of UI components requires manually updating all instances where component is reused
  - debugger is next to useless; this is probably the biggest productivity killer
    * no inspection of STL container types
    * accuracy of call stacks is hit and miss
    * inability to inspect local vars of calling function
    * frequently crashes to a point where system requires reboot
    * about 80% of the time variables cannot be inspected, showing only "???"
    * data breakpoints cannot have conditions
  - no parallel compilation until recently (they acquired a 3rd-party plugin to do so, and it's buggy)
  - contents of project files change every time they are opened; terrible for version control
  - custom compiler/linker, still based on clang 5.0, but not supporting most compiler switches

Absolutely abysmal that debugging an 8-bit PIC with MPLAB is a less frustrating experience than working with this IDE when targeting Win32 for a modern desktop application.
 
The following users thanked this post: MK14, harerod

Offline RedLionTopic starter

  • Regular Contributor
  • *
  • Posts: 60
  • Country: lu
  • Professional power dissipator
Re: [Banter] What is the worst software you have used for its price?
« Reply #90 on: May 23, 2022, 11:19:42 am »
I'd like to add MathCad to the list.
I don't even get the point of it, like what problem are they trying to solve?
The syntax is so confusing that you are forced to use the drop down menus, the UI is MS Office, but worse, and of course it's nice and slow.
It's like someone thought "what if we made MATLAB, but worse?"
Oh well, at least it's reasonably cheap.
We burn money we don't have
From shareholders we don't like
To develop products we can't sell
 

Offline Zero999

  • Super Contributor
  • ***
  • Posts: 19491
  • Country: gb
  • 0999
Re: [Banter] What is the worst software you have used for its price?
« Reply #91 on: May 23, 2022, 12:46:11 pm »
It is not possible to do that with software.
Not with dumb software and dumb data buffering schemes, no..

But let's say you have an 8-bit ADC and 64-byte cachelines, and as you receive the data, you construct a parallel lookup of min-max values, filling another cacheline per 32 cachelines (2048 samples).  You've now dropped the memory bandwidth required to find the min-max for any range to 1/32nd, except that the start and end points have a granularity of 64 samples.  (So do those cachelines separately, I guess.)

Similarly, if you can reorder the received data so that you get the cachelines across waveforms, you can construct the display from left to right and use all sorts of clever spanning techniques.  Even antialiased lines boil down to lots and lots of additions, and a few not-too-large lookup tables (that depend on the time base and such).

Using an ARM or Intel/AMD core for that kind of stupid work makes no sense.  The cores are slow at that sort of stuff, and you're paying for nothing there.  Instead, stick a DSP or similar between the acquire buffer and the UI processor, so that the UI processor computes and sets the lookup tables and memory transfers, and the DSP just spits out intensity slices (say, 5-bit full-height pixel columns) that the UI processor then just composes into the display.

To do this sort of stuff right, one must think of the data flow.  A very similar thing really bugs me with most simulator software running on HPC clusters: they calculate, then communicate, then calculate, then communicate, and so on, instead of doing them both at the same time.  Why?  Because it is hard to think of what data needs to be transferred after the next step, when the next step is yet to be calculated.  The data does need to be present before the next time step is calculated, so essentially your data transfers need to be at least one step ahead, and that means predictive and/or heuristic transfers without false negatives (you can transfer extra, but you need to transfer all that are needed), node load balancing, and so on...  Just too hard for programmers who can always just tell professors to buy more and newer hardware.
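The compute/communicate overlap argued for above can be sketched as a double-buffered prefetch: start the transfer for the next step before computing the current one. This is a generic single-process illustration with hypothetical `fetch_halo` and `compute_step` stand-ins, not real MPI or cluster code:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_halo(step):
    """Stand-in for receiving boundary data from neighbour nodes."""
    return [step] * 4  # pretend halo: four boundary values

def compute_step(state, halo):
    """Stand-in for one local time step that consumes the halo."""
    return state + sum(halo)

def run(steps):
    state = 0
    with ThreadPoolExecutor(max_workers=1) as pool:
        pending = pool.submit(fetch_halo, 0)       # transfer for step 0
        for step in range(steps):
            halo = pending.result()                # block only if the transfer lags
            if step + 1 < steps:
                # Kick off the next transfer *before* computing, so the
                # communication overlaps with the computation below.
                pending = pool.submit(fetch_halo, step + 1)
            state = compute_step(state, halo)
    return state

# Each step adds 4*step, so run(4) -> 4*(0+1+2+3) = 24
```

The hard part in a real simulation is exactly what the post says: knowing which data the *next* step will need before it has been computed, which pushes you toward conservative (superset) transfers and load balancing.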

Linux is simultaneously a good and bad thing. It's as good as the price we pay, because the "support" is "piss off, you should know this, we learnt and so now must you, and learn all the new acronyms and syntax which some autistic 'community' assumes you knew from birth, and we know you have a busy life, but spend a month trawling sourceforge, then compile... rinse and repeat"
No, that's not it.

For open source communities, end users are a net negative: a cost, not a benefit.  Only those who contribute back, somehow, are worth the effort of helping.  What "actual 9-5 humans want, need and use" is absolutely, completely irrelevant.  This is why Linux greybeards laugh at you when you say something like "you need to do X so that Linux can become as popular as Y".  It is as silly to us as Insta-gran and Fakebook "influencers" demanding free food and accommodation.

As to why paid Linux end-user support is relatively hard to find, I think it is because getting such a commercial venture going is highly risky.  It is relatively simple to set up Linux user support in an organization, but as a commercial service, you have huge risks from customers who vent their disappointment at Linux not being a drop-in Windows replacement at you, ruining your reputation at the same time.  The risks aren't worth the gains.
I mean, I consider you, eti, a professional person.  But I for sure would not like to put anyone under your ire at Linux and open source.  The £20 or so an hour you'd be willing to pay would not be worth it.

Perhaps it is time to just admit that Linux and open source is not for you.  And that's fine; it's not supposed to be for everyone, it's just a tool among others.

“Not for you”? Lol. I’ve been using it as a seasoned pro since 2004. That’ll be a common mistake of assuming you know someone online.

The issue with Linux is not so much Linux as the arrogance of the obsessives and how they  decry “evil” (read: hugely hard working, clever  and deservedly successful) Microsoft etc. Sour grapes sure make a lot of whine.

Linux fans whine about the fact that hardware designed and made for a profitable market, i.e. the gargantuan and profitable desktop and server market, doesn’t run perfectly under Linux etc etc. Hey guys, make your own hardware if you’re that upset (hang on, that would require a large paying user base and a parts market forming around it due to it being dominant and used EVERYWHERE FOR DECADES)

Whiners whine. I’ve heard (and stupidly taken part in) every conceivable, predictable pro-Linux debate ever online, and the same junk goes round and round for years. Windows pays bills; work involving Windows pays bills. Work done on Macs pays bills. Servers running Linux pay huge bills too. Desktop Linux is what’s left at the end of the meal. That’s how it panned out.

If they want to be successful TRULY, then it’s time to walk out of the pity party, go home and  put on their suits and go do some selling, never mind everyone else. Linux people love to evangelise and criticise. That massages egos but doesn’t pay well.
Linux developers don't give a toss about whether their software is used or not. The situation regarding help and support with Linux is similar to that of people asking questions on this forum.  It's explained quite well at the end of the post linked below:
https://www.eevblog.com/forum/programming/rust-is-political/msg4188961/#msg4188961

In other words, to the developers, most of the time users are just a pain in the bum. They ask silly questions. The only time they're helpful is when they find bugs, but for a developer to help you, you need to convince them you've tried everything and researched the problem properly.

This doesn't count the Linux fanboys who are convinced it's the best thing ever and everyone should use it.
 
The following users thanked this post: free_electron

Offline Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6231
  • Country: fi
    • My home page and email address
Re: [Banter] What is the worst software you have used for its price?
« Reply #92 on: May 23, 2022, 01:10:58 pm »
Perhaps it is time to just admit that Linux and open source is not for you.  And that's fine; it's not supposed to be for everyone, it's just a tool among others.
“Not for you”? Lol. I’ve been using it as a seasoned pro since 2004. That’ll be a common mistake of assuming you know someone online.
If using an OS made me that unhappy, I'd just switch.

I do not believe you can be much of a pro when you do not even understand the basic operating principles of the developer communities, and so deeply, so emotionally, hate the software and its developers.

The issue with Linux is not so much Linux as the arrogance of the obsessives and how they  decry “evil” (read: hugely hard working, clever  and deservedly successful) Microsoft etc. Sour grapes sure make a lot of whine.
Eh?  Past business practices are what made Microsoft evil.  It is much less evil now.  I personally wouldn't even use "evil" for MS anymore; especially not compared to social media companies.

Claiming that anyone who thought that was only "arrogant obsessive sour grapes who whine a lot", is just lying.  Perhaps it helps you feel better about yourself, but it has nothing to do with the truth.  If you ignore how open source developer communities work, and assume they are just your personal support forum, it's your own damn fault you're ignored and not helped, not theirs.

In other words, to the developers, most of the time users are just a pain in the bum. They ask silly questions. The only time they're helpful is when they find bugs, but for a developer to help you, you need to convince them you've tried everything and researched the problem properly.

This doesn't count the Linux fanboys who are convinced it's the best thing ever and everyone should use it.
True.  Like I said, time, effort, and knowledge are currency.  Spend some to help the developers, and they'll help you back.

I'm not sure which group I have more practical trouble with, the ones who insist on using the term "open sores", or the Linux or FOSS fanbois.
Reminds me of the time when I was seven or so, and hammered about a thousand nails into a small piece of log.  I made a hedgehog!
Not exactly a good use of a hammer and nails, in hindsight.
 

Offline bd139

  • Super Contributor
  • ***
  • Posts: 23018
  • Country: gb
Re: [Banter] What is the worst software you have used for its price?
« Reply #93 on: May 23, 2022, 01:11:58 pm »
Linux developers don't give a toss about whether their software is used or not. The situation regarding help and support with Linux is similar to that of people asking questions on this forum.  It's explained quite well at the end of the post linked below:
https://www.eevblog.com/forum/programming/rust-is-political/msg4188961/#msg4188961

In other words, to the developers, most of the time users are just a pain in the bum. They ask silly questions. The only time they're helpful is when they find bugs, but for a developer to help you, you need to convince them you've tried everything and researched the problem properly.

This doesn't count the Linux fanboys who are convinced it's the best thing ever and everyone should use it.

Oh hell I'm not getting involved in that thread at all.

You're right. I don't get involved in open source any more. I used to. No one gives a shit and most of the developers I've had the misfortune of having to deal with are basically comic book guy from the Simpsons. Some of them are positively mentally unwell. I know that's cruel but it's true. I've tried over and over to get things fixed. I've tried fixing them myself and PR'ing them in as requested. I either get a face full of crap or silence. Waste of time so I don't bother. I merely leverage the pile of shit to make money. I don't even do that now - I prefer to tell other people what to do for more money and not have to deal with that level of problem  :-DD

What's even worse is it's the same with paid support options from Redhat etc as well. They employ a lot of the core developers and it's impossible even getting them to fix or patch something.

Then again, that's most software companies. I will notably exclude Apple and Microsoft there, as I have actually had defects acknowledged and fixed by both vendors on a non-galactic timescale.
 

Offline eugene

  • Frequent Contributor
  • **
  • Posts: 493
  • Country: us
Re: [Banter] What is the worst software you have used for its price?
« Reply #94 on: May 23, 2022, 03:44:28 pm »
Since the discussion seems headed in the direction of "the worst software ever is anything FOSS" I'll counter with this: over the decades, the worst software I've used were programs that I paid some solo hobbyist $20 for.

(Bit of advice: if you write a post and find that most of the content is just rehashing your tired old complaint about some generalized group of people you disagree with, then don't press the "Post" button.)
90% of quoted statistics are fictional
 
The following users thanked this post: nctnico

Offline T3sl4co1l

  • Super Contributor
  • ***
  • Posts: 21657
  • Country: us
  • Expert, Analog Electronics, PCB Layout, EMC
    • Seven Transistor Labs
Re: [Banter] What is the worst software you have used for its price?
« Reply #95 on: May 23, 2022, 03:48:18 pm »
In general, less popular / commonly used software is worse than very popular software.  Cost is irrelevant, or at best weakly correlated.  So, take your pick.


(Bit of advice: if you write a post and find that most of the content is just rehashing your tired old complaint about some generalized group of people you disagree with, then don't press the "Post" button.)

"But I need to know that everyone hates the same things I do!!!"

Tim
Seven Transistor Labs, LLC
Electronic design, from concept to prototype.
Bringing a project to life?  Send me a message!
 
The following users thanked this post: MK14

Offline eugene

  • Frequent Contributor
  • **
  • Posts: 493
  • Country: us
Re: [Banter] What is the worst software you have used for its price?
« Reply #96 on: May 23, 2022, 04:19:33 pm »
"But I need to know that everyone hates the same things I do!!!"

The only effective cure for that problem is to be less hateful.
90% of quoted statistics are fictional
 

Offline bd139

  • Super Contributor
  • ***
  • Posts: 23018
  • Country: gb
Re: [Banter] What is the worst software you have used for its price?
« Reply #97 on: May 23, 2022, 04:28:53 pm »
It's not hate, it's experience.

To quote my father: "If you put your dick in a crocodile, you're not going to hate the crocodile for what it does, but you are going to learn not to put your dick in a crocodile again".
 

Offline hans

  • Super Contributor
  • ***
  • Posts: 1637
  • Country: nl
Re: [Banter] What is the worst software you have used for its price?
« Reply #98 on: May 23, 2022, 05:26:25 pm »
There are people that take software as seriously as a political view or religion. The ones that don't want to use Facebook, Google, WhatsApp, Telegram... anything that's proprietary or "has data". If I ask them to play an online game, they say they can't, because their open source NVIDIA GPU driver is still bodged on Debian, an OS where loading a decently performing (but non-supported) binary-blob driver is considered a sin; and moreover they don't want to boot into Windows, as they haven't done so in months and it will likely result in an update-reboot loop for 3 days straight, at which point the opportunity of 'let's play a game' has sadly passed. I've no experience with Mac because their hardware pricing & repair policy is at least as stupid.

I have thrown some personal frustrations into that hypothetical story, as I have "things to hate" on all platforms and operating systems. But I've met plenty of (CS) people at uni that were close or similar to this mindset.  And this is something I hate even more than having something against one piece of software: having contrived such a small world of possibilities that it's impossible to do anything productive or fun. In the end, for me computers and software are just tools. I prefer to use Linux for embedded programming, because in my workflow the tools are more readily available and easier to use. I've used Visual Studio, Keil and IAR software before, and I can make things work with them, but I'd rather not again. I don't want to go anywhere near the terminal when I see a Windows machine, let alone use WSL or jump through a thousand hoops to install Cygwin, CMake and GCC so CLion (my favorite IDE) can finally compile my hello world program for an ARM processor. But even as a daily Linux user, I still prefer to use Altium for PCB design, as that's what I've learned at several companies and am most productive with. I can't ever go back to Eagle or KiCad; those tools and UIs feel like the Stone Age.

But going back to "bad software", I also cast my vote for Mentor Graphics. I recall my grad internship where they showed me DxDesigner as "the" tool for cooperative schematic design. But even after creating a simple schematic for an LED buck converter, that tool had managed to corrupt its design database on a solo design project. HOW? How is this even supposed to work OK in a multi-user design environment?  IIRC it even had a (shortcut) button to "fix database" -- apparently it happens often enough that they implemented a "fix" for it. At least equally bad was PADS: that software's DRC engine and Gerber exporter have resulted in several PCB designs being modified with a Dremel.

 
The following users thanked this post: MK14, bd139, Nominal Animal

Online free_electron

  • Super Contributor
  • ***
  • Posts: 8517
  • Country: us
    • SiliconValleyGarage
Re: [Banter] What is the worst software you have used for its price?
« Reply #99 on: May 23, 2022, 11:39:14 pm »
It is not possible to do that with software.
Not with dumb software and dumb data buffering schemes, no..

But let's say you have a 8-bit ADC and 64-byte cachelines, and as you receive the data, you construct a parallel lookup of min-max values,
that's what they do. they stream the data from adc to memory. there is a min/max detector looking at every sample passing by. after x samples the pair is stored in a visualisation buffer.
if you change the timebase post-acquisition they simply instruct that hardware to stream the data from ram back into ram so it flies by that min/max detector again. it's a circular buffer. you fill it from the adc. post processing? send the output of the buffer back to the input and do a once-over of all addresses.
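In software terms, the re-stream described above amounts to replaying the stored record past the same min/max detector with a new decimation factor whenever the timebase changes. A toy model of that idea (not any vendor's actual hardware, and the sample values are made up for illustration):

```python
def redecimate(record, factor):
    """Replay a captured record through a min/max detector.

    Emulates streaming RAM back past the hardware min/max block:
    every `factor` samples, emit one (min, max) pair for display.
    A partial bucket at the end of the record is dropped.
    """
    out = []
    lo = hi = None
    for i, sample in enumerate(record):
        lo = sample if lo is None else min(lo, sample)
        hi = sample if hi is None else max(hi, sample)
        if (i + 1) % factor == 0:
            out.append((lo, hi))
            lo = hi = None
    return out

# Zooming out just reruns the same pass with a bigger factor:
record = [0, 5, -3, 2, 7, -1, 4, 4]
assert redecimate(record, 2) == [(0, 5), (-3, 2), (-1, 7), (4, 4)]
assert redecimate(record, 4) == [(-3, 5), (-1, 7)]
```

Since the pass is a single linear sweep over the buffer, it is exactly the kind of work a small streaming block in hardware does at full memory bandwidth, with no general-purpose CPU in the loop.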
Professional Electron Wrangler.
Any comments, or points of view expressed, are my own and not endorsed , induced or compensated by my employer(s).
 

