Author Topic: who uses fpga's  (Read 4805 times)


Offline james_s

  • Super Contributor
  • ***
  • Posts: 14407
  • Country: us
Re: who uses fpga's
« Reply #125 on: April 13, 2021, 09:56:53 pm »
Yes, and you can hammer in nails with a crowbar, or pry things apart with a claw hammer. They're both made of tool steel, so in a sense they're the same thing, and if crowbars were cheaper than hammers you could build a house with crowbars instead, so the market would have to take that into account.

Yes, FPGAs and GPUs (and CPUs and ASICs and...) are made of silicon and built from logic gates and various other elements, and yes, they can do some of the same tasks. Yes, you can make a GPU (of sorts) out of an FPGA, but you cannot make an FPGA out of a GPU. The FPGA is a massive digital breadboard that you can "wire" into any sort of digital device you want. THAT is what makes it so powerful: it is versatile, it is flexible. A GPU, on the other hand, is a hardwired device. It will always be a GPU; it is a microprocessor, it cannot do anything at all without code, and you can never change what it is. You cannot add a new instruction or tweak an existing one to work differently; the design is cast in stone.

An FPGA is a block of clay; a GPU is a ceramic mug. You can make a mug out of a block of clay. It probably won't be as nice as a mug you can buy from the store, but you can make it exactly the way you like: precisely the dimensions, shape and design you want, your imagination is the limit. You're not limited to a mug either; you can make a vase, or a flower pot, or a floor tile, or a trivet to set hot pans on, or a candle holder, or a sculpture, or if you're like some of the kids in my ceramics class in high school you can make a bong and hope the teacher doesn't realize what it is. Blocks of clay are not really competing with ready-made mugs. Making your own mug will probably cost more, depending on your skill it will probably not turn out as nice, and it will certainly take a lot more time and effort, but in exchange you get the flexibility to make whatever you want, to specify every detail, to tune and tweak the design in any way imaginable.

Maybe you need one very special mug designed for some unique need that will never be made in volumes sufficient to have it mass produced, or maybe you have an idea for a new type of mug that you wish to test in the field before committing to the considerable cost of manufacturing a batch of tens of thousands. THAT is what FPGAs are good for, and THAT is what people pay the money for. If they need a GPU, they buy a GPU, but a GPU is useless if what you need doesn't involve mathematical calculations of the sort used to render 3D images.
 

Offline ebastler

  • Super Contributor
  • ***
  • Posts: 3877
  • Country: de
Re: who uses fpga's
« Reply #126 on: April 13, 2021, 10:04:53 pm »
Sorry, but I think I do have a point here.  8)

GPUS and FPGAS can actually run the same task.   AND!  U can make a gpu on an fpga!  Its another kind of microcontroller.
If fpgas were cheaper than GPUS then it would cheaper to make a gpu on an fpga.   

Both a gpu and an fpga are digital logic,  they are the same thing, wires connecting transistors.  A gpu is an fpga in the form of a gpu with the programmality removed.

So the market would have to take it into account to a degree.

This is either a rather lame trolling attempt or an indication of rather muddled thinking.
I'm out of here, have fun with your digital fpga microcontroller gpu logic programmality transistors.
 

Offline Capernicus

  • Regular Contributor
  • *
  • Posts: 93
  • Country: au
Re: who uses fpga's
« Reply #127 on: April 13, 2021, 10:20:22 pm »
You can emulate an FPGA with a GPU.
 

Offline james_s

  • Super Contributor
  • ***
  • Posts: 14407
  • Country: us
Re: who uses fpga's
« Reply #128 on: April 13, 2021, 10:27:18 pm »
You can emulate an FPGA with a GPU.

No, you can't.

You could emulate some of the behavior modeled by an FPGA, but no more than you can simulate any electronic circuit on a general-purpose computer. It won't be the same thing, and it won't be useful for the sort of applications that FPGAs are good for. Going back to an earlier example, even Pong cannot be emulated at the gate level anywhere near real time on even a modern high-end PC.
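A rough back-of-envelope sketch of that point (all figures are assumptions picked for illustration, not measurements of any real design or machine):

```python
# Why naive gate-level emulation falls behind real time: a simulator that
# re-evaluates every gate on every clock cycle has to do gates * clock_rate
# evaluations per second of simulated time.

GATES = 5_000                      # assumed gate count for a Pong-class design
CLOCK_HZ = 7_000_000               # assumed ~7 MHz clock for that era of hardware
HOST_EVALS_PER_SEC = 500_000_000   # assumed gate evaluations/s one CPU core manages

evals_per_real_second = GATES * CLOCK_HZ
slowdown = evals_per_real_second / HOST_EVALS_PER_SEC

print(f"{evals_per_real_second:.2e} gate evaluations per simulated second")
print(f"roughly {slowdown:.0f}x slower than real time")
```

Even with these generous assumptions the naive approach runs tens of times slower than the hardware; smarter (event-driven) simulators help, but the gap to a large modern FPGA design is far wider still.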
 

Offline Capernicus

  • Regular Contributor
  • *
  • Posts: 93
  • Country: au
Re: who uses fpga's
« Reply #129 on: April 13, 2021, 10:43:36 pm »
That depends on how much silicon real estate is devoted to the GPU.
 

Offline emece67

  • Frequent Contributor
  • **
  • Posts: 341
  • Country: es
Re: who uses fpga's
« Reply #130 on: April 13, 2021, 11:20:55 pm »
Definitely, this has mutated into trolling. Bye.
Information must flow.
 

Offline Daixiwen

  • Regular Contributor
  • *
  • Posts: 241
  • Country: no
Re: who uses fpga's
« Reply #131 on: April 14, 2021, 07:56:19 am »
You can emulate an FPGA with a GPU.
It's actually a lot easier to emulate an FPGA with a CPU. It's called an HDL simulator.
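At its core, that's what such a simulator does on a CPU: evaluate the design's logic once per clock edge. A minimal illustrative sketch (plain Python, not any real simulator's API) for a 4-bit counter:

```python
# Cycle-based simulation in miniature: model the synchronous logic as a
# function from current state to next state, then call it once per clock edge.

def counter_step(state: int, enable: bool, width: int = 4) -> int:
    """One rising clock edge of a 'width'-bit up-counter with enable."""
    if enable:
        return (state + 1) % (1 << width)  # wrap at 2**width, like hardware
    return state

# Simulate 20 clock cycles with the counter enabled.
q = 0
for cycle in range(20):
    q = counter_step(q, enable=True)

print(q)  # 20 mod 16 = 4
```

A real HDL simulator adds event scheduling, delta cycles and signal resolution on top, but the principle is the same, and it is exactly why simulation on a CPU is straightforward while being orders of magnitude slower than the silicon.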
 

Offline dietert1

  • Frequent Contributor
  • **
  • Posts: 882
  • Country: de
    • CADT Homepage
Re: who uses fpga's
« Reply #132 on: April 14, 2021, 08:34:09 am »
Nonsense. We don't call this "emulation" but "simulation". It will be a factor of 1000 or more slower than the real FPGA, and completely useless except for debugging the VHDL description to some extent. Nobody will simulate a complete FPGA solution but only components.

Regards, Dieter
« Last Edit: April 14, 2021, 08:36:32 am by dietert1 »
 

Offline emece67

  • Frequent Contributor
  • **
  • Posts: 341
  • Country: es
Re: who uses fpga's
« Reply #133 on: April 14, 2021, 10:15:21 am »
Nobody will simulate a complete FPGA solution but only components.

Nonsense. People have been simulating complete FPGA designs for many, many years, even whole systems with several FPGAs. Any design subject to certification needs to be fully simulated (as a whole, not just its parts), and the pre-synthesis and back-annotated post-P&R simulations need to match. There are also people doing that even when no certification is involved.

When one needs to simulate long periods of time, there are techniques to accelerate such simulations by (time-)scaling some elements of the architecture. But simulations running for days or weeks are not so uncommon when you need to simulate minutes or hours of real time.
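One common form of that time-scaling trick, sketched with made-up numbers (the clock rate, timer period and scale factor are all assumptions for illustration): shrink a long timer's terminal count in the simulated design so that each tick still represents the same real-world interval, while costing far fewer simulated cycles.

```python
# Time-scaling a slow timer for simulation: divide its terminal count by a
# scale factor so the simulator reaches each "one minute" tick much sooner.

CLOCK_HZ = 100_000_000           # assumed 100 MHz system clock
REAL_TERMINAL = CLOCK_HZ * 60    # counter value for a once-per-minute tick on hardware
SCALE = 10_000                   # scale factor applied in simulation only
SIM_TERMINAL = REAL_TERMINAL // SCALE

print(f"hardware counts to {REAL_TERMINAL} per tick")
print(f"simulation counts to {SIM_TERMINAL} per tick ({SCALE}x fewer cycles)")
```

The rest of the design is unchanged; only elements whose exact period doesn't affect the logic under test get scaled, which is why this must be applied selectively.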

In one case we simulated a whole system with an ARM core, memory controller, flash controller, caches and memories, booting Linux and launching an application that accessed in-house audio and video codecs, all inside the same FPGA, with no certification needed.

Incidentally, for some years now people have been talking about the possibility of GPU-accelerated HDL simulation.

Regards.
 

Offline dietert1

  • Frequent Contributor
  • **
  • Posts: 882
  • Country: de
    • CADT Homepage
Re: who uses fpga's
« Reply #134 on: April 14, 2021, 10:50:11 am »
A simulation will never be an emulation; that was the wrong term, and it represents a stupid idea. Even with a powerful simulator, it will still be a lot slower than the original. Of course, if you abuse an FPGA to simulate an ARM processor, the original will probably run faster, and then you have a chance of making an emulator for that FPGA design. Everybody understands that FPGAs can be used to simulate or even emulate other types of logic. The other way around, it's simulation only.

Regards, Dieter
 

Offline Daixiwen

  • Regular Contributor
  • *
  • Posts: 241
  • Country: no
Re: who uses fpga's
« Reply #135 on: April 14, 2021, 11:14:25 am »
I should have put a smiley there.... My point to Capernicus was that if you want a GPU to emulate an FPGA, you'll run into the same set of problems as when simulating with a CPU. It may work in real time for some simple designs, but most of the time it will be a lot slower.

If you need to run an algorithm that GPUs are well optimized for, then by all means use a GPU; trying to port it to an FPGA would probably be a waste of time and resources. But there are all sorts of algorithms FPGAs are better suited for, if you are looking for performance.
 
