Author Topic: EEVblog #858 - Red Pitaya  (Read 34780 times)


Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5317
  • Country: gb
Re: EEVblog #858 - Red Pitaya
« Reply #50 on: March 12, 2016, 01:26:01 pm »
I realise I'm something of a Luddite, but I've yet to see any real-world evidence, as opposed to contrived examples and a very few edge cases, that profile-based adaptive optimisation provides any gain in an overall sense compared to static optimisation.

That would be an extraordinarily difficult thing to do with a real-life application: how can you meaningfully compare, say, a Java application with a C application? In all such cases the quality of the libraries and application, the development timescale and the availability of staff would totally dominate any results.

That's really part of my point.

Quote
Quote
We've had similar techniques in the database world for a couple of decades now, the main problem with these techniques is lack of determinism which is a serious frustration when troubleshooting particularly non-functional facets of a system. All that code you diligently tested is now doing something different, but not all the time. I'd much rather have a consistently slowly performing system to troubleshoot I can reproduce than one that sometimes randomly starts running ten or a hundred or a thousand times slower.

I'd argue that your system architecture's function must be insensitive to detailed timing delays, since that's what will occur in production. Of course, when benchmarking performance, it is vital to "warm up" a HotSpot JVM before doing the performance test, but that is neither difficult nor lengthy.

No, I'm talking predominantly about non-functional aspects, in particular systems running fast one minute and slow the next, frequently by orders of magnitude. One of the key facets of troubleshooting is having a consistently reproducible scenario. The problem is that stochastic methods like this bring non-deterministic results (in terms of performance) that are frequently difficult to reproduce in a controlled environment. Also bear in mind that much of the time it is not even possible to test with real data in certain domains, such as financial systems or other areas involving personally identifiable information.

Quote
Quote
My point is that something you non-functionally tested in lower environments is more likely to behave differently in higher environments because it's virtually impossible to come up with a practical test matrix.

In that case the system architecture and implementation is badly defective, unless you restrict "differently" to mean "slower but still correct".

Yes, that is why I said "non-functional". I am sure we've all been on the end of a phone to a call centre when they say "the system's running slow today", or when something unexpectedly takes longer than usual. As an end user, I'd much prefer to have something that consistently runs in, say, 5 seconds than something that runs in 1 second 95% of the time and a minute 5% of the time, even though the expected time in the latter case is lower (0.95 × 1 s + 0.05 × 60 s ≈ 4 s).

Anyway I'm now waaay off topic!
 

Offline Howardlong

  • Super Contributor
  • ***
  • Posts: 5317
  • Country: gb
Re: EEVblog #858 - Red Pitaya
« Reply #51 on: March 12, 2016, 01:36:33 pm »
I am not sure what I did, but I spent three hours of my life today trying to get the WiFi to work. I did succeed, but in trying to document the steps, I ended up back at square one.

One of the confusing things I am sure is that after making certain changes to the SD card, such as adding the wpa_supplicant.conf, you need to reboot the RP a second time after the change. I am pretty certain I did this, as documented, and it didn't work, but having polished up my "penguin skills" (I love that!) and randomly pushing buttons and reading through dozens of mostly half relevant and out-of-date topics online, I got it to work.

In documenting my changes, I went back to a blank SD card, followed the instructions without my changes first, and it started working. So, rather frustratingly, I now have no clue what I did to get it to work.

Basically, though, I'll say this...

o Note that by default, the RP tries to work as an AP rather than as a WiFi client, which confuses things
o After adding the wpa_supplicant.conf to the SD card from your main computer, don't forget to reboot twice.
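For anyone trying this, a minimal wpa_supplicant.conf for client (station) mode looks something like the sketch below. The SSID and passphrase are placeholders, and the exact fields the Red Pitaya image expects may vary between firmware versions, so treat this as a starting point rather than gospel:

```
ctrl_interface=/var/run/wpa_supplicant
update_config=1

network={
    ssid="YourNetworkName"
    psk="YourPassphrase"
}
```

Drop the file in the root of the SD card from your main computer, then do the double reboot as noted above.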

 

Offline vodka

  • Frequent Contributor
  • **
  • Posts: 518
  • Country: es
Re: EEVblog #858 - Red Pitaya
« Reply #52 on: March 12, 2016, 07:08:10 pm »
Quote
It destroys the ignorant prejudice that "Java has to be slow because it is interpreted". In other words, with Hotspot type techniques, an emulated processor+program is faster than the real processor+program.

That's simply illogical, because when you emulate a processor you always need more RAM and ROM space on top of the loaded program, so it is impossible for it to go faster than the original processor running the compiled program. Unless the computer containing the emulated processor has more RAM, ROM and computing power; then it is possible, but that isn't fair play.

When I emulated a Nintendo 64 game on an AMD K6 it ran badly; even with an updated graphics card I had to turn the sound off because all I heard was beeps.

And the HotSpot JVM is itself written in C++, and object-oriented on top of that (God help us) :palm:
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19458
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: EEVblog #858 - Red Pitaya
« Reply #53 on: March 12, 2016, 09:02:48 pm »
Quote
It destroys the ignorant prejudice that "Java has to be slow because it is interpreted". In other words, with Hotspot type techniques, an emulated processor+program is faster than the real processor+program.

That's simply illogical, because when you emulate a processor you always need more RAM and ROM space on top of the loaded program, so it is impossible for it to go faster than the original processor running the compiled program. Unless the computer containing the emulated processor has more RAM, ROM and computing power; then it is possible, but that isn't fair play.

No. Read the HP Dynamo report before you repeat incorrect statements. Then you will be surprised, and finally you will understand your errors.

The subject under discussion is speed, not memory usage. There was sufficient RAM for all programs; ROM is irrelevant to this discussion. The same PA-RISC processor and operating system (HP-UX) was used throughout.

From the abstract: "the performance of many +O2 optimized SPECint95 binaries running under Dynamo [i.e. emulated] is comparable to the performance of their +O4 optimized version running without Dynamo.".  Think about that: the more optimised code running normally sometimes performed worse than the less optimised code being emulated. Static compilation of C code simply cannot be very efficient due to C language features (especially the possibility of aliasing). And if you don't believe/understand that, then you should discuss it with the High Performance Computing crowd that have been pushing machine performance professionally for half a century.

Quote
When i emulated Nintendo64 game in a AMDK6 went bad, even so with the updated graphic card,i had turned  off the sound because only heard beeps.

OK; that shows you don't know how to write emulators. Big deal. Fortunately other people do know how to write emulators.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

