Author Topic: who uses fpga's  (Read 4794 times)


Offline rstofer

  • Super Contributor
  • ***
  • Posts: 8120
  • Country: us
Re: who uses fpga's
« Reply #75 on: April 09, 2021, 11:54:44 pm »
How big a computer is an IBM 1130?  That's nifty that you got that working; so you could fit about 4-5 more cores on it if you wanted?

I don't know that IBM ever referred to the IBM 1130 as a 'minicomputer' but that's about what it was.  Maximum memory was 32K 16-bit words.  At first it was just a single user system and later in life it became a host for remote entry.  Sometimes it was a front end for the mainframes.  IBM was the largest customer and they used it on the factory floor for a number of operations.  First computer you could see over the top of and the first computer system to lease for less than $1000 per month.  I assume this was the 4K paper tape based system.  Our department was charged $8 per run hour for usage.  There was also a vector graphics display with light-pen function.  Useful for CAD involved with building airplanes.  We didn't have one at the time.  The internal (to the main desk cabinet) disk drive held 512,000 words on a removable cartridge.  It seemed like a lot way back when.  I never did fill one up.

There was another model, the IBM 1800, which was conceptually identical except for having 2 more instructions.  It had serious real-time capability and, where I worked, it was used for data collection from a supersonic wind tunnel.  Lift, drag, that kind of thing.  To be fair, these were very small models because the window was only about 12" square, IIRC.  It had a gigantic Roots-style supercharger to compress the air into a storage tank.  On the discharge side there was a valve styled just like a camera shutter so it could open quickly.  The mechanical guys tell me the structural calculations for the piping were rather involved.  For some reason, it was half-U shaped around the building, and a supersonic slug of air hitting the elbows created interesting mathematics.

http://ibm1130.org/

https://en.wikipedia.org/wiki/IBM_1800_Data_Acquisition_and_Control_System

EVERYTHING necessary to build a replica is in the Functional Characteristics manual:

http://media.ibm1130.org/E0006.pdf

The processor had no hardware stack and, as a result, self-modifying code abounded.  The return address of a calling function was always stored in the first word of the called function; so much for reentrancy!  An indirect branch through that location got the PC back to the instruction after the call.
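That linkage is easy to model in a few lines.  Here is a toy Python sketch of the store-return-address-in-the-callee scheme described above; the memory size and addresses are made up purely for illustration:

```python
# Toy model of the IBM 1130 subroutine linkage: the call plants the
# return address in the FIRST word of the callee, and the callee
# returns with an indirect branch through that word.
# Memory size and addresses are illustrative, not real 1130 values.

memory = [0] * 32      # tiny word-addressed "core"
SUB = 16               # hypothetical address of a subroutine

def call(return_addr, target):
    """Store the return address at target, resume at target + 1."""
    memory[target] = return_addr
    return target + 1  # execution continues just past the link word

def ret(sub_addr):
    """Indirect branch through the subroutine's first word."""
    return memory[sub_addr]

pc = call(101, SUB)          # call from the instruction at address 100
assert pc == SUB + 1         # body starts after the stored link word
assert memory[SUB] == 101    # return address lives inside the callee...
assert ret(SUB) == 101       # ...and gets the PC back after the call
# A second call before the first returned would overwrite memory[SUB]:
# exactly why the scheme is not reentrant.
```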

I guess you had to be there ('71 or so) and be a college student to really appreciate what this machine could do.
« Last Edit: April 09, 2021, 11:57:25 pm by rstofer »
 
The following users thanked this post: Capernicus

Offline Capernicus

  • Regular Contributor
  • *
  • Posts: 92
  • Country: au
Re: who uses fpga's
« Reply #76 on: April 10, 2021, 12:36:24 am »
It had 64 kbytes of RAM, same as a Commodore 64. :)   There are still people coding really good games on a C64 today, probably with emulators; I see them on YouTube.

If you emulate that computer on an FPGA, it runs at the FPGA clock speed, doesn't it, not the original clock speed, unless you downclock it.

Hot-swappable hard-disc drives are cool, I always liked them.  Before thumb drives became popular I used to just bring my hard disc to my friend's house instead of burning CDs, because it was just easier for software transfer; we had tonnes of games to copy. :)

How much pressure does it take to accelerate air to supersonic speed?

Makes me think of an idea: maybe pneumatic computers could oscillate almost as quickly as electrical ones if they were fully solid state.
Then the whole thing would be just a hunk of plastic or fibreglass; you don't need metal at all, which has interesting fab potential.

 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 8120
  • Country: us
Re: who uses fpga's
« Reply #77 on: April 10, 2021, 01:49:13 am »
If you emulate that computer on an FPGA, it runs at the FPGA clock speed, doesn't it, not the original clock speed, unless you downclock it.
Yes.  My emulation runs at 50 MHz and the original ran at about 400 kHz (ours had the slower 3.6 us core so, perhaps, 278 kHz).  I slowed the IO devices down to near factory speed (1000 cards/minute, 800 lines/min, 100 plotter steps/sec, 16 typewriter characters per second) to avoid the problem where the CPU hadn't executed enough instructions before the IO-complete interrupt occurred.  It seems I have increased the speed since then, but when I port the project to an Artix-7, I will reevaluate this.  There's a problem watching printer output go by at 10,000 lines per minute: it's off the screen before you can see what happened!
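A back-of-envelope check of the I/O pacing problem, assuming (unrealistically) one instruction per clock cycle; the absolute counts are rough, but the ratio is the point:

```python
# Cycles available between card-reader interrupts at 1000 cards/minute,
# original ~278 kHz machine vs. the 50 MHz FPGA emulation.
card_period_s = 60.0 / 1000             # 1000 cards/min -> 60 ms per card

cycles_original = round(278_000 * card_period_s)
cycles_fpga = round(50_000_000 * card_period_s)

print(cycles_original)                  # 16680 cycles per card originally
print(cycles_fpga)                      # 3000000 cycles per card emulated
print(cycles_fpga // cycles_original)   # ~179x more headroom
```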
Quote
Hot-swappable hard-disc drives are cool, I always liked them.  Before thumb drives became popular I used to just bring my hard disc to my friend's house instead of burning CDs, because it was just easier for software transfer; we had tonnes of games to copy. :)
The disk wasn't really hot-swappable, but each group had its own dedicated disk, stored in a rack between the two systems.
Quote
How much pressure does it take to accelerate air to supersonic speed?
I have no idea, but I seem to recall that the drive motor was 8,000 HP.  I may be way off on that; I didn't work over there very often.  I do know that we had to tell the utility before we could start the motor.  Also, there was a jacking motor that kept the shafting in continuous rotation; the idea was to prevent the shaft from warping under the loads imposed by its own weight.
« Last Edit: April 10, 2021, 02:05:02 am by rstofer »
 
The following users thanked this post: Capernicus

Offline dmills

  • Super Contributor
  • ***
  • Posts: 1847
Re: who uses fpga's
« Reply #78 on: April 10, 2021, 11:21:40 am »
Pressures required for supersonic flow depend strongly on the nozzle geometry, in not-always-obvious ways...

Interestingly, amateur-built supersonic wind tunnels are an occasional thing; there was one described in "The Amateur Scientist" back before Scientific American turned crap.

IIRC the usual approach is a pressure vessel valved into a suitable nozzle geometry and exhausting into an evacuated tank.  The key is that the system exhausts into vacuum, so you are not having to push air out of the way to make your flow.

In an era of cheap 3D printing this is a much easier thing to build models for than it was back when carved wood was the way to play.
 
The following users thanked this post: Capernicus

Offline jmelson

  • Super Contributor
  • ***
  • Posts: 1958
  • Country: us
Re: who uses fpga's
« Reply #79 on: April 10, 2021, 04:00:21 pm »

How much pressure does it take to speed up air past supersonic?

IIRC, 13 PSI in a properly shaped nozzle.  Maybe just a bit more for a plain orifice.

Jon
 

Offline Capernicus

  • Regular Contributor
  • *
  • Posts: 92
  • Country: au
Re: who uses fpga's
« Reply #80 on: April 10, 2021, 05:51:59 pm »
13 psi?  Isn't standard air pressure at room temperature about 14?  |O

Or do you mean when it's entering a vacuum?  That is pretty efficient! =)
« Last Edit: April 10, 2021, 05:53:50 pm by Capernicus »
 

Offline gnuarm

  • Frequent Contributor
  • **
  • Posts: 890
  • Country: aq
Re: who uses fpga's
« Reply #81 on: April 11, 2021, 12:03:54 am »
13 psi?  Isn't standard air pressure at room temperature about 14?  |O

Or do you mean when it's entering a vacuum?  That is pretty efficient! =)

1 atm is 14.7 lb/sq in absolute.  Most often people talk about pressure relative to ambient, which is called gage (or gauge) pressure.
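Which is why the "13 PSI" answer above is plausible as a gauge figure: the standard isentropic choked-flow condition for air (gamma = 1.4) needs an absolute pressure ratio of about 1.89 across the nozzle.  A quick check, taking ambient as 14.7 psia:

```python
# Critical (choked-flow) pressure ratio for a gas with ratio of specific
# heats gamma: p0 / p_throat = ((gamma + 1) / 2) ** (gamma / (gamma - 1))
gamma = 1.4                         # air
ratio = ((gamma + 1) / 2) ** (gamma / (gamma - 1))

p_ambient_psia = 14.7               # 1 atm absolute
p_gauge = ratio * p_ambient_psia - p_ambient_psia   # gauge = absolute - ambient

print(round(ratio, 3))              # 1.893
print(round(p_gauge, 1))            # 13.1 psi gauge -- close to the quoted 13 PSI
```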
Rick C.
  - Get 1,000 miles of free Supercharging
  - Tesla referral code - https://ts.la/richard11209
 

Offline james_s

  • Super Contributor
  • ***
  • Posts: 14392
  • Country: us
Re: who uses fpga's
« Reply #82 on: April 11, 2021, 07:36:21 am »
Hot-swappable hard-disc drives are cool, I always liked them.  Before thumb drives became popular I used to just bring my hard disc to my friend's house instead of burning CDs, because it was just easier for software transfer; we had tonnes of games to copy. :)

Back when any kind of external hard drive was exotic and expensive, I used to take a bare 40MB IDE HDD over to my friends' places and we would plug it into the internal cable in our PCs to transfer data around, mostly games. Doesn't sound like much now, but 40MB was a fair amount of space in the mid 90s.
 
The following users thanked this post: Capernicus

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 8120
  • Country: us
Re: who uses fpga's
« Reply #83 on: April 11, 2021, 03:16:45 pm »
I remember buying a 500MB drive for $500 and considering it a virtual garbage can.  I could never fill it up.  This was around '90.  It came in handy when 386BSD Unix came out in '92.

Many years earlier (around '80?), I wrote some driver code for a 5 MB fixed drive (SCSI stuff) that cost about $5k.

Now we buy  1 TB drives for around $50.
 
The following users thanked this post: Capernicus

Offline fourfathom

  • Frequent Contributor
  • **
  • Posts: 625
  • Country: us
Re: who uses fpga's
« Reply #84 on: April 11, 2021, 03:39:36 pm »
I remember buying a 500MB drive for $500 and considering it a virtual garbage can.  I could never fill it up.  This was around '90.  It came in handy when 386BSD Unix came out in '92.

Many years earlier (around '80?), I wrote some driver code for a 5 MB fixed drive (SCSI stuff) that cost about $5k.

Now we buy  1 TB drives for around $50.

Scoff, youngsters!  My first hard drive was 5MB, for my original dual-floppy IBM PC clone (4.77 MHz clock).  The half-shoebox-size drive was actually a 10MB unit with bad sectors, usable as 5MB.  Ludicrously expensive, for me at least, but an amazing amount of space on that MSDOS machine.  In keeping with the topic, I actually used that PC to design PALs (Programmable Array Logic).  PALs led to CPLDs, which led to gate-array ASICs and FPGAs.  I used PALs for general logic and state-machine designs.  I also used PROMs (Programmable Read-Only Memory) for state-machine work.  This was back in the mid-1970s or early '80s.
 
The following users thanked this post: Capernicus

Offline ebastler

  • Super Contributor
  • ***
  • Posts: 3874
  • Country: de
Re: who uses fpga's
« Reply #85 on: April 11, 2021, 04:33:13 pm »
My first hard drive was a 5MB for my original dual-floppy IBM PC clone (4.77 MHz clock).  [...]
I actually used that PC to design PALs (Programmable Array Logic). [...]
This was back in the mid-1970s or early '80s.

Given that the IBM PC came out in late 1981, and generic clones only became available in 1983, the mid-1980s seems more plausible?  Also, the IBM PC XT came out with a built-in hard disk in 1983, so 1983/84 seems like a plausible year for getting a second-grade 5 MB disk.
 

Offline SilverSolder

  • Super Contributor
  • ***
  • Posts: 4915
  • Country: 00
Re: who uses fpga's
« Reply #86 on: April 11, 2021, 04:56:28 pm »
My first hard drive was a 5MB for my original dual-floppy IBM PC clone (4.77 MHz clock).  [...]
I actually used that PC to design PALs (Programmable Array Logic). [...]
This was back in the mid-1970s or early '80s.

Given that the IBM PC came out in late 1981, and generic clones only became available in 1983, the mid-1980s seems more plausible?  Also, the IBM PC XT came out with a built-in hard disk in 1983, so 1983/84 seems like a plausible year for getting a second-grade 5 MB disk.

I recall getting a 20MB disk in 1984 or so...  you never forget a bill like that!  :D
 

Offline fourfathom

  • Frequent Contributor
  • **
  • Posts: 625
  • Country: us
Re: who uses fpga's
« Reply #87 on: April 11, 2021, 05:39:11 pm »
My first hard drive was a 5MB for my original dual-floppy IBM PC clone (4.77 MHz clock).  [...]
I actually used that PC to design PALs (Programmable Array Logic). [...]
This was back in the mid-1970s or early '80s.

Given that the IBM PC came out in late 1981, and generic clones only became available in 1983, the mid-1980s seems more plausible?  Also, the IBM PC XT came out with a built-in hard disk in 1983, so 1983/84 seems like a plausible year for getting a second-grade 5 MB disk.
Yeah, you're no doubt correct.  I was guessing; there have been lots of years and more than a few employers along the way.  It was the era of the early clones: as I recall, my first clone had a 4.77/10 MHz clock-selector switch on the back.
 

Offline Capernicus

  • Regular Contributor
  • *
  • Posts: 92
  • Country: au
Re: who uses fpga's
« Reply #88 on: April 11, 2021, 07:18:28 pm »
Scoff, youngsters!  My first hard drive was 5MB, for my original dual-floppy IBM PC clone (4.77 MHz clock).  The half-shoebox-size drive was actually a 10MB unit with bad sectors, usable as 5MB.  Ludicrously expensive, for me at least, but an amazing amount of space on that MSDOS machine.  In keeping with the topic, I actually used that PC to design PALs (Programmable Array Logic).  PALs led to CPLDs, which led to gate-array ASICs and FPGAs.  I used PALs for general logic and state-machine designs.  I also used PROMs (Programmable Read-Only Memory) for state-machine work.  This was back in the mid-1970s or early '80s.

There's nothing new in digital logic, is there?  It's always been there and it's always been the same.
How big were the PALs you were working with, and how fast could you clock them?

The FPGA I'm looking at buying is this: https://au.mouser.com/ProductDetail/Intel-Altera/10AX048H3F34I2SG/?qs=Rv6LVDxB0ZouSsIL4OazgQ%3D%3D
It's AU$3,200, with 480,000 LUTs (4-input, 2-output).  It's very expensive, and it already doesn't look like I'll be able to fit my whole design onto it fully unrolled.  (I want to unroll the whole thing!  >:D)

I haven't got the fully finished logic schematic yet, but it's getting there.  It's a 3D geometry thing, full of repeated rotate operations.
I also have a plan for an FPGA raytracer I might do as well, for casting heaps and heaps of rays, for global illumination and radiosity.

So expensive.  I wonder how hard these things are to make from scratch yourself; seems like such a ripoff just for a grid of electronic lookup tables...  :P
« Last Edit: April 11, 2021, 07:24:42 pm by Capernicus »
 

Offline fourfathom

  • Frequent Contributor
  • **
  • Posts: 625
  • Country: us
Re: who uses fpga's
« Reply #89 on: April 11, 2021, 07:44:59 pm »
There's nothing new in digital logic, is there?  It's always been there and it's always been the same.
How big were the PALs you were working with, and how fast could you clock them?

The FPGA I'm looking at buying is this: https://au.mouser.com/ProductDetail/Intel-Altera/10AX048H3F34I2SG/?qs=Rv6LVDxB0ZouSsIL4OazgQ%3D%3D
It's AU$3,200, with 480,000 LUTs (4-input, 2-output).  It's very expensive, and it already doesn't look like I'll be able to fit my whole design onto it fully unrolled.  (I want to unroll the whole thing!  >:D)

I haven't got the fully finished logic schematic yet, but it's getting there.  It's a 3D geometry thing, full of repeated rotate operations.
I also have a plan for an FPGA raytracer I might do as well, for casting heaps and heaps of rays, for global illumination and radiosity.

So expensive.  I wonder how hard these things are to make from scratch yourself; seems like such a ripoff just for a grid of electronic lookup tables...  :P

I don't recall the specifics, but I probably used a 16R8 or something similar (https://www.dataman.com/media/datasheet/Cypress/PAL[16L8_16R8].pdf).  The spec sheet says the clock-to-Q delay is about 25 ns, but I wouldn't have been running them that fast.

I could probably build you your FPGA at home, with a $50 million investment, if you didn't mind waiting a few years.  The programming tools would be another million.  These things are fast and complicated.  That said, my current FPGA needs are modest, with a $5 chip fitting the bill just fine: a Lattice MachXO2 with 1200 LUTs and 1 PLL.  I'm clocking the internals at 100 MHz.
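For context on PAL speeds, the usual ceiling for a registered PAL output that feeds back to its own inputs is one clock-to-output delay plus one setup time per clock period.  The 25 ns figure is from the datasheet mentioned above; the 15 ns setup time below is an assumed, illustrative value only:

```python
# Rough f_max for a registered PAL path with feedback:
# the clock period must cover clock-to-output plus input setup.
t_co_ns = 25.0    # clock-to-output delay (datasheet figure quoted above)
t_su_ns = 15.0    # setup time -- assumed for illustration

f_max_mhz = 1e3 / (t_co_ns + t_su_ns)   # 1 / 40 ns, in MHz
print(round(f_max_mhz))   # 25 -- i.e. a ~25 MHz ceiling for this path
```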
 
The following users thanked this post: Capernicus

Offline gnuarm

  • Frequent Contributor
  • **
  • Posts: 890
  • Country: aq
Re: who uses fpga's
« Reply #90 on: April 11, 2021, 08:31:52 pm »
So expensive.  I wonder how hard these things are to make from scratch yourself; seems like such a ripoff just for a grid of electronic lookup tables...  :P

I haven't followed this thread enough to know what you are using them for, but maybe there are other ways to skin this cat.   

I think people mentioned the parallel processing capabilities of a GPU, which comes with issues similar to those of massive FPGAs, such as large size and heat loads.  Another option is a multiprocessor that doesn't use so much power: the GreenArrays GA144, an array of 144 very simple CPUs on a chip that uses a maximum of about 1 Watt.  Each CPU has very limited resources but cranks at up to 700 MIPS.  They run about $10 in quantity and are not hard to interconnect.
Rick C.
 

Offline Capernicus

  • Regular Contributor
  • *
  • Posts: 92
  • Country: au
Re: who uses fpga's
« Reply #91 on: April 11, 2021, 08:56:39 pm »

I don't recall the specifics, but I probably used a 16R8 or something similar (https://www.dataman.com/media/datasheet/Cypress/PAL[16L8_16R8].pdf).  The spec sheet says the clock-to-Q delay is about 25 ns, but I wouldn't have been running them that fast.

I could probably build you your FPGA at home, with a $50 million investment, if you didn't mind waiting a few years.  The programming tools would be another million.  These things are fast and complicated.  That said, my current FPGA needs are modest, with a $5 chip fitting the bill just fine: a Lattice MachXO2 with 1200 LUTs and 1 PLL.  I'm clocking the internals at 100 MHz.

They aren't that hard to make!  The only problem is getting them smaller, I guess.

If you could get transistors under a millimetre, maybe you could put down a 300x300 map of transistors in not too much space; that would give you 90,000 transistors.  One step of my "physics model" is looking to be 60,000 gates, but I need 20 steps...
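A quick sanity check on those numbers, using the common rule-of-thumb (not a measurement) of about four transistors per 2-input CMOS NAND gate:

```python
# How far a 300x300 grid of discrete transistors gets against a
# 60,000-gates-per-step design unrolled 20 times.
grid = 300 * 300                 # 90,000 discrete transistors
gates_available = grid // 4      # NAND equivalents at ~4 transistors each
gates_needed = 60_000 * 20       # fully unrolled requirement

print(gates_available)                   # 22500
print(gates_needed)                      # 1200000
print(gates_needed // gates_available)   # ~53x short of the unrolled design
```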

Home made ASIC->  https://www.assemblymag.com/articles/83109-placing-tiny-parts-precisely
« Last Edit: April 11, 2021, 11:30:57 pm by Capernicus »
 

Offline james_s

  • Super Contributor
  • ***
  • Posts: 14392
  • Country: us
Re: who uses fpga's
« Reply #92 on: April 12, 2021, 02:08:42 am »
There's nothing new in digital logic, is there?  It's always been there and it's always been the same.
How big were the PALs you were working with, and how fast could you clock them?

What newness would you expect?  There are only so many ways you can manipulate a bit, and technically you only need NAND gates: you can make any other gate you want out of those.  Boolean logic is like basic arithmetic; the fundamentals were laid out a LONG time ago and haven't really changed.  The only things that change are that parts get smaller, faster, and/or lower in power consumption.

IIRC PALs had something roughly equivalent to 10-20 logic blocks and could be clocked up to around 20MHz or so depending on the family. They were used a lot for address decoding in microprocessor systems, I've seen PALs and GALs in early 90s computers and expansion boards. One of the compact Macs (SE?) was a notable use of them, replacing a bunch of glue logic on the motherboard.
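The NAND-universality claim above is a one-screen truth-table check:

```python
# Build NOT, AND and OR from NAND alone, then verify every input combination.
def NAND(a, b): return 1 - (a & b)

def NOT(a):    return NAND(a, a)
def AND(a, b): return NOT(NAND(a, b))
def OR(a, b):  return NAND(NOT(a), NOT(b))

for a in (0, 1):
    assert NOT(a) == 1 - a
    for b in (0, 1):
        assert AND(a, b) == (a & b)
        assert OR(a, b) == (a | b)
print("NAND alone rebuilds NOT, AND and OR")
```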
 

Offline gnuarm

  • Frequent Contributor
  • **
  • Posts: 890
  • Country: aq
Re: who uses fpga's
« Reply #93 on: April 12, 2021, 02:31:45 am »
There's nothing new in digital logic, is there?  It's always been there and it's always been the same.
How big were the PALs you were working with, and how fast could you clock them?

What newness would you expect?  There are only so many ways you can manipulate a bit, and technically you only need NAND gates: you can make any other gate you want out of those.  Boolean logic is like basic arithmetic; the fundamentals were laid out a LONG time ago and haven't really changed.  The only things that change are that parts get smaller, faster, and/or lower in power consumption.

Sure, logic was invented a long time ago, even before electronics; Boole published his Laws of Thought in 1854.  However, that does not equate to the simplifications you describe, such as NAND gates being universal.  While you can make any logic function using NAND gates, that doesn't mean it is the best way to do it.  We learned multiplexer logic in college and, lo and behold, it showed up in FPGAs!  We learned multivalued logic, which is based on a more general form of algebra called Post algebra and even now sees very little use.  One FPGA vendor found that rather than a 4-input LUT, they could combine an AND gate and an XOR gate as a more general form of programmable logic, complete enough to implement adders and anything else.  It didn't pan out, not because it was a poor idea, but because it could not compete unless implemented in the same process as the other brands of FPGA.  Also, they had very little dedicated routing, using the logic blocks themselves for routing.

My point being there are many ways to implement logic, programmable and otherwise, so we need to remain flexible.  BTW, the typical 4-input LUT does not use logic gates at all in the mux; it uses transmission gates, because of the speed issues with gate-based logic.
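The AND+XOR cell mentioned above really is complete once you allow a constant 1, which is a quick truth-table exercise:

```python
# {AND, XOR, constant 1} is functionally complete:
# NOT x = x XOR 1, and x OR y = (x AND y) XOR x XOR y.
def NOT(x):   return x ^ 1
def OR(x, y): return (x & y) ^ x ^ y

for x in (0, 1):
    assert NOT(x) == 1 - x
    for y in (0, 1):
        assert OR(x, y) == (x | y)
print("AND/XOR (plus constant 1) is a complete basis")
```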
Rick C.
 

Offline SilverSolder

  • Super Contributor
  • ***
  • Posts: 4915
  • Country: 00
Re: who uses fpga's
« Reply #94 on: April 12, 2021, 02:57:04 am »
[...] We learned multivalued logic which is based on a more general form of algebra called Post algebra, which even now sees very little use. [...]

Are modern SSD drives with 3 or 4 valued cells (instead of binary logic values) an example of an application?
 

Offline Bassman59

  • Super Contributor
  • ***
  • Posts: 1756
  • Country: us
  • Yes, I do this for a living
Re: who uses fpga's
« Reply #95 on: April 12, 2021, 02:59:46 am »
I'm just a GPU nut; I've been doing it with DirectCompute ever since I was 21, doing DirectX 6 Win32 tutorials and watching Martin Short on TV back in the olden days, possibly 10 years after Innerspace came out.  I loved that movie, it was absolute magic, and the robot suits were fricken amazing!  Though as I think back, it would terrify the absolute shit out of me now if it ever became a reality.  Curse the singularity.

So the biggest mystery to me now is, now I'm old (40), sick, still feel and look like an immature teenager, and just about carked it in the head: what the hell are FPGAs, who uses them, and what do they use them for?

So, four pages, and as typical, nobody's answered your question.

I'll just say upfront that I've been working with FPGAs (and CPLDs, and PALs/GALs and whatnot) since the days of the Xilinx XC3000, when we used to use ViewLogic for schematic entry and HDLs and synthesis did not exist. Those days were horrible and it was common to make a design change and let the tools run overnight and hopefully you didn't make a mistake. Like working with PALs, you had to do your own logic minimization.

So, anyway. It seems like all of my FPGA designs are of the "we need a boatload of logic and it needs to fit in this small space" variety. And it's all the sort of thing where a sequential-execution processor doesn't make sense for the design. Most things now are controlling a sensor (providing it digital signals from a state sequencer), capturing data from that sensor (whether directly from the sensor's built-in ADCs or with external ADCs) and buffering and reordering it, controlling biases for the sensor, running a temperature-control loop for the sensor, reading housekeeping (voltages, temperature, vacuum, whatever), an interface for FRAM which is used as local parameter store, and managing a gigabit communications link to a remote server, and that includes a command parser and fast data path. Because of the environment in which the product is used, there is also a LOT of test logic built in.

Some of the stuff could be done in a standard processor, I suppose, but finding one that fits the bill for all of the various things is tough. Pick an ARM, any ARM. It has a fixed number of serial interfaces (UART, USART, SPI, I2C, whatever) but what if what's offered is not sufficient? For example, the most recent design has I think a dozen SPI and SPI-like interfaces. Rather than trying to figure out how to make Weird Sensor's "SPI" port work with a micro, I design the exact interface necessary.

Or, there's a dozen digital pots in the design, and to save pins on a feedthrough, we did something which is basically like QSPI but not really. There are six pairs of digital pots. Each pot needs an 8-bit control setting. Each pair is in series, so 16 bits have to be shifted out to load both. And all of the pairs are in parallel, so they share common SCLK and CS\ and there are six parallel MOSI lines. (We don't read them back, so there's no mux or other logic to deal with the MISO.) I'd probably slit my wrists if I had to do that in a micro, but in the FPGA it was trivially easy. Oh, yeah, we decided to save even more feedthrough pins
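That pot-loading scheme can be sketched in a few lines of Python.  The bit order (MSB first, far pot's byte first) and the example settings are my assumptions for illustration, not the actual design:

```python
# Six series pairs of digital pots share SCLK and CS\; each clock edge
# puts one bit on each of six parallel MOSI lines, 16 bits per pair.

def pack_pair(far, near):
    """16-bit word for one pair: the far pot's byte must shift through first."""
    return ((far & 0xFF) << 8) | (near & 0xFF)

def mosi_stream(pairs):
    """Yield one 6-wide MOSI bit vector per clock edge, MSB first."""
    words = [pack_pair(far, near) for far, near in pairs]
    for bit in range(15, -1, -1):
        yield [(w >> bit) & 1 for w in words]

pairs = [(0xA5, 0x3C)] * 6               # arbitrary example settings
ticks = list(mosi_stream(pairs))
print(len(ticks))    # 16 -- clock edges needed to load every pair at once
print(ticks[0])      # [1, 1, 1, 1, 1, 1] -- MSB of 0xA5 on all six lanes
```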

Another thing. A local server will send commands up over the gigabit fiber. The FPGA captures the commands (a four-byte packet) and decodes them. There are 256 possible commands (the command token is one byte). The cool thing about the FPGA command decoder is that it happens in parallel. That is, the time from receipt of the command token to the execution of the command is the same for ALL commands (and is very fast, like two clock ticks). Do that in a sequential-execution processor with a big CASE statement. How many compares does it need to do for the last command in the list?
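The contrast can be caricatured in software.  The command table here is invented, and a real FPGA decoder is parallel hardware rather than a loop; the comparison count is the point:

```python
# A linear CASE-style scan does up to N comparisons before dispatch;
# the FPGA decoder's latency is the same for every opcode.

def sequential_decode(token, table):
    """Scan the table in order, counting equality comparisons."""
    comparisons = 0
    for opcode, handler in table:
        comparisons += 1
        if opcode == token:
            return handler, comparisons
    return None, comparisons

table = [(op, f"cmd_{op}") for op in range(256)]   # 256 one-byte commands
_, best = sequential_decode(0, table)
_, worst = sequential_decode(255, table)
print(best, worst)   # 1 256 -- versus a constant two-tick decode in fabric
```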

And so on. The FPGA is basically just one big pile of logic resources you can use as your design dictates. You can even embed a processor core in your FPGA, for things which benefit from one (like, say, an Ethernet stack). I've done Xilinx MicroBlaze, Xilinx' Virtex-4 with the hard PowerPC, and I'm going to use a RISC-V in a new thing. The point is that the processor is "just another logic block" to the FPGA: you can do the things that "make sense" to do in the processor in a processor, and you can do the specialist stuff in the FPGA.

Quote
But has anyone here got some real experience with them - is there any forums to go to get tips off guys that have been doing it for ages?

Back in the day, there was the Usenet newsgroup comp.arch.fpga, which was exactly what you're asking for. I don't know if it still exists.

Good luck.
 
The following users thanked this post: Capernicus

Offline Bassman59

  • Super Contributor
  • ***
  • Posts: 1756
  • Country: us
  • Yes, I do this for a living
Re: who uses fpga's
« Reply #96 on: April 12, 2021, 03:01:11 am »
[...] We learned multivalued logic which is based on a more general form of algebra called Post algebra, which even now sees very little use. [...]

Are modern SSD drives with 3 or 4 valued cells (instead of binary logic values) an example of an application?

Multi-value memory cells, like the various "constellations" of digital modulation, are more about packing more information in a given space than any particular application of the data.
 

Offline 1design

  • Regular Contributor
  • *
  • Posts: 145
Re: who uses fpga's
« Reply #97 on: April 12, 2021, 03:12:12 am »
In my line of work FPGAs are used for two very specific services that no other GPU/CPU does well:
-Receive market data and trade faster than any Ethernet stack and processor pair will ever be able to; I can't go into the details, just google HFT.
-Be the home of custom ultra-low-latency modems; again, I can't go too deep into how/what.  You can google "Sniper in Mahwah".

Both are constantly updated and improved, so an ASIC would be obsolete by the time it was done, and both need to achieve performance levels that are not achievable in software or on a GPU.

The point is that there are plenty of industries/services out there where ASIC product cycles are too long and software is too slow.  Look at defence, space, communications, physics (particle accelerators), etc.
 

Offline fourfathom

  • Frequent Contributor
  • **
  • Posts: 625
  • Country: us
Re: who uses fpga's
« Reply #98 on: April 12, 2021, 03:39:54 am »
So, four pages, and as typical, nobody's answered your question.

That's pretty harsh.  I think the question has been answered many times, from multiple perspectives.  Your answer is another good one.
 

Offline Capernicus

  • Regular Contributor
  • *
  • Posts: 92
  • Country: au
Re: who uses fpga's
« Reply #99 on: April 12, 2021, 03:50:01 am »

I'll just say upfront that I've been working with FPGAs (and CPLDs, and PALs/GALs and whatnot) since the days of the Xilinx XC3000, when we used to use ViewLogic for schematic entry and HDLs and synthesis did not exist. Those days were horrible and it was common to make a design change and let the tools run overnight and hopefully you didn't make a mistake. Like working with PALs, you had to do your own logic minimization.

So, anyway. It seems like all of my FPGA designs are of the "we need a boatload of logic and it needs to fit in this small space" variety. And it's all the sort of thing where a sequential-execution processor doesn't make sense for the design. Most things now are controlling a sensor (providing it digital signals from a state sequencer), capturing data from that sensor (whether directly from the sensor's built-in ADCs or with external ADCs) and buffering and reordering it, controlling biases for the sensor, running a temperature-control loop for the sensor, reading housekeeping (voltages, temperature, vacuum, whatever), an interface for FRAM which is used as local parameter store, and managing a gigabit communications link to a remote server, and that includes a command parser and fast data path. Because of the environment in which the product is used, there is also a LOT of test logic built in.

Some of the stuff could be done in a standard processor, I suppose, but finding one that fits the bill for all of the various things is tough. Pick an ARM, any ARM. It has a fixed number of serial interfaces (UART, USART, SPI, I2C, whatever) but what if what's offered is not sufficient? For example, the most recent design has I think a dozen SPI and SPI-like interfaces. Rather than trying to figure out how to make Weird Sensor's "SPI" port work with a micro, I design the exact interface necessary.

Or, there's a dozen digital pots in the design, and to save pins on a feedthrough, we did something which is basically like QSPI but not really. There are six pairs of digital pots. Each pot needs an 8-bit control setting. Each pair is in series, so 16 bits have to be shifted out to load both. And all of the pairs are in parallel, so they share common SCLK and CS\ and there are six parallel MOSI lines. (We don't read them back, so there's no mux or other logic to deal with the MISO.) I'd probably slit my wrists if I had to do that in a micro, but in the FPGA it was trivially easy. Oh, yeah, we decided to save even more feedthrough pins

Another thing. A local server will send commands up over the gigabit fiber. The FPGA captures the commands (a four-byte packet) and decodes them. There are 256 possible commands (the command token is one byte). The cool thing about the FPGA command decoder is that it happens in parallel. That is, the time from receipt of the command token to the execution of the command is the same for ALL commands (and is very fast, like two clock ticks). Do that in a sequential-execution processor with a big CASE statement. How many compares does it need to do for the last command in the list?

And so on. The FPGA is basically just one big pile of logic resources you can use as your design dictates. You can even embed a processor core in your FPGA, for things which benefit from one (like, say, an Ethernet stack). I've done Xilinx MicroBlaze, Xilinx' Virtex-4 with the hard PowerPC, and I'm going to use a RISC-V in a new thing. The point is that the processor is "just another logic block" to the FPGA: you can do the things that "make sense" to do in the processor in a processor, and you can do the specialist stuff in the FPGA.

Quote
But has anyone here got some real experience with them - is there any forums to go to get tips off guys that have been doing it for ages?

Back in the day, there was the Usenet newsgroup comp.arch.fpga, which was exactly what you're asking for. I don't know if it still exists.

Good luck.

Thanks for the info about your work!   
I've actually learnt a lot since the beginning of this thread.  I've downloaded the HDL IDE now, and I'm getting some stuff tested in Visual Studio, ready to convert over to the LUT format.

GPU vs FPGA: hard to say what's better out of a dev board and a GPU talking through an Arduino or other micro.

I actually don't know if the FPGA is going to win over the GPU for my task yet.  I think it will, but I haven't actually got it emulating yet to know for sure.  I'm looking at a $3000 FPGA for my job!!!  So it's not cheap at all; you can buy an RTX 3080 for that price.  I know from my experience how well the GPU is going to go, and for the FPGA to beat it, it's going to be about cutting down precision and getting rid of as many complex operations as I can.

I know what you mean about having to optimize it to hell to fit it in the FPGA!!!  Getting it into one cycle is really tough, and it needs to be, or I may as well whip out my GPU and just run it with that.

I'm right in the middle of the big optimization job.  I've got a big system I'm chopping down bit by bit to see how small I can get the instructions without ruining the output too much.  It's 3D geometry, like a 3D game, except I can't get hypotenuses because that requires a square root; I'm not using matrix multiplies and am using very little trigonometry.  It's going well!!  I think I'm going to make it, I just need a bit more time.  I think a lot of people with less drive would have given up by now and just resorted to vector/matrix math and dot and cross products, instead of making it original.

As I made this thing, I accidentally worked out on the side how I could get a 3D fractal to render on an FPGA with global illumination.  It really can beat the GPU's ray count, but you need to cater a lot to simpler methods, I think.  More like how they coded PS1 games instead of DX11 games.

 

