Author Topic: current PCB design best practices are stone-age  (Read 12345 times)


Offline FrankBussTopic starter

  • Supporter
  • ****
  • Posts: 2365
  • Country: de
    • Frank Buss
current PCB design best practices are stone-age
« on: May 25, 2016, 07:06:10 am »
I just watched an interesting presentation about new ideas for circuit and PCB design. He created small building blocks and then wrote some Python scripts to automatically create a Spice model, run a simulation, and verify output values, similar to test-driven development in software engineering. He can combine the blocks into larger blocks as well and write a test case for those. And other scripts automatically place the blocks on a PCB and connect them according to some higher-level description.
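A minimal sketch of what such a test-driven building block might look like, in Python. All the names here (VoltageDivider, netlist, expected_out) are invented for illustration; the actual scripts from the presentation would hand the generated netlist to a real SPICE engine and compare the simulated node voltages against the expected values:

```python
# Sketch only: a "building block" that can emit its SPICE netlist fragment
# and state the output value its test case expects. A real flow would feed
# the netlist to a simulator and check the simulated result instead of the
# analytic oracle used here.

class VoltageDivider:
    """Reusable block: R1 from vin to out, R2 from out to ground."""

    def __init__(self, name, r1, r2):
        self.name, self.r1, self.r2 = name, r1, r2

    def netlist(self, vin_node, out_node):
        # SPICE cards for this block instance
        return (f"R{self.name}_1 {vin_node} {out_node} {self.r1:g}\n"
                f"R{self.name}_2 {out_node} 0 {self.r2:g}\n")

    def expected_out(self, vin):
        # Test oracle: the unloaded divider output
        return vin * self.r2 / (self.r1 + self.r2)

div = VoltageDivider("div1", 10e3, 10e3)
print(div.netlist("in", "out"))
print(div.expected_out(5.0))  # equal resistors halve the input: 2.5
```

Blocks like this can then be composed into larger blocks, each carrying its own testbench, which is exactly the "test-driven" part of the idea.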

This reminds me very much of VHDL (hence the title of the presentation, "reinventing VHDL badly"). Back in the old days you had to design logic with lots of logic gates. Later there were PLAs, GALs, etc., and ABEL, which made it easier: you didn't have to care about placement and connections anymore. Now we have VHDL and Verilog, and powerful FPGAs, which allow even more high-level descriptions of circuits. And nobody cares about the placement of the gates anymore, or how the lookup tables are configured to implement the function (except in some special cases, like high-speed designs, where you want to place some functions near the pins).

PCB design is still pre-PLA, where you have to place and route individual resistors etc., at least with the programs I know, like Eagle or Altium Designer. Is there a professional PCB EDA program where you can easily create blocks, test them with test cases in a circuit simulator, combine them into bigger blocks, and automatically place all components on a board and auto-route it (like VHDL synthesis), including constraints, like "don't route the big power supply trace near the 24-bit ADC inputs" (like VHDL timing constraints)?

It would be nice to have something like VHDL for circuits. You would describe a circuit in a high-level language, then hit the compile button and you get the routed board. Write a testbench and hit the simulate button to create a Spice circuit, simulate it and the testbench verifies it.
So Long, and Thanks for All the Fish
Electronics, hiking, retro-computing, electronic music etc.: https://www.youtube.com/c/FrankBussProgrammer
 

Offline ataradov

  • Super Contributor
  • ***
  • Posts: 11228
  • Country: us
    • Personal site
Re: current PCB design best practices are stone-age
« Reply #1 on: May 25, 2016, 07:16:11 am »
It would be nice to have something like VHDL for circuits. You would describe a circuit in a high-level language, then hit the compile button and you get the routed board.
Auto-routers and auto-placers have existed for decades. The quality of the output varies, but so does the quality of VHDL synthesis tools.

For a tight and heavy FPGA project you will do a lot of manual placement and babysitting of the placer.
Alex
 

Offline FrankBussTopic starter

  • Supporter
  • ****
  • Posts: 2365
  • Country: de
    • Frank Buss
Re: current PCB design best practices are stone-age
« Reply #2 on: May 25, 2016, 12:19:09 pm »
I worked on an FPGA project with a big Cyclone IV, which was over 90% full, with 100 MHz signals, and didn't need to do any manual placement. And there was never a problem with the VHDL synthesis when my testbench worked and all timing constraints were met (except for occasional tool bugs). Of course, it is not a Stratix or Virtex and not very fast. I guess this can change with really complex projects with faster signals. I know of only one project where manual placement was necessary, for a high-speed serial decoder near the pins.

Do you have examples of auto-placers? It looks like Cadence can do it, even with constraints to do something useful for analog components, but I've never used that program. I know some people here hate auto-routers and auto-placers and think layout is some kind of art, not just another mundane engineering task that will be automated someday :)
https://www.eevblog.com/forum/suggestions/future-content-suggestion-auto-placement-(not-auto-routing)/
So Long, and Thanks for All the Fish
Electronics, hiking, retro-computing, electronic music etc.: https://www.youtube.com/c/FrankBussProgrammer
 

Online nctnico

  • Super Contributor
  • ***
  • Posts: 26754
  • Country: nl
    • NCT Developments
Re: current PCB design best practices are stone-age
« Reply #3 on: May 25, 2016, 03:11:01 pm »
IMHO it depends a lot on how easy these tools are to drive. In many cases you'll need to set a lot of parameters which come into play when creating a PCB, like controlled impedances, maximum crosstalk coupling, currents, heat dissipation, maximum resistance between components, etc. By the time you have set all of these, you could probably have been well along just placing the components by hand.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline batteksystem

  • Regular Contributor
  • *
  • Posts: 167
  • Country: hk
    • My ebay store
Re: current PCB design best practices are stone-age
« Reply #4 on: May 25, 2016, 03:17:01 pm »
One problem with the auto-router is that it requires too much configuration to produce acceptable output. Unlike logic, the components we place have different kinds of characteristics which are not understood by the auto-router. Therefore, you need to input a lot of information to tell the auto-router things like which "class" a trace belongs to, how wide it needs to be, and how tightly a group of decals needs to be placed together.

On the other hand, our brains already know all this and can place them directly without thinking. I think auto-routers do have a benefit when you are routing >1000 traces.

Offline tszaboo

  • Super Contributor
  • ***
  • Posts: 7307
  • Country: nl
  • Current job: ATEX product design
Re: current PCB design best practices are stone-age
« Reply #5 on: May 25, 2016, 03:37:51 pm »
http://www.altium.com/video-how-employ-snippets-design-reuse
I will just put this link here, so it does not get lost.
 

Offline FrankBussTopic starter

  • Supporter
  • ****
  • Posts: 2365
  • Country: de
    • Frank Buss
Re: current PCB design best practices are stone-age
« Reply #6 on: May 25, 2016, 03:59:02 pm »
IMHO it depends a lot on how easy these tools are to drive. In many cases you'll need to set a lot of parameters which come into play when creating a PCB, like controlled impedances, maximum crosstalk coupling, currents, heat dissipation, maximum resistance between components, etc. By the time you have set all of these, you could probably have been well along just placing the components by hand.
Many of these parameters could be set automatically. For example, you could create a testbench with the required input voltage, and the software could even choose a 1206 resistor because of its higher heat dissipation, and choose 16 mil traces instead of 10 mil. It might be useful to override the defaults in some cases, but it should be possible to automate many things.
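As a rough sketch of that idea: the function names below are invented, and the numbers are textbook rules of thumb rather than real derating or IPC-2152 data, but it shows how package and trace width could fall out of the testbench values instead of being set by hand:

```python
# Illustration only: derive the resistor package from dissipated power and
# the trace width from current. Ratings and widths are rough typical values,
# not manufacturer or IPC data.

def pick_resistor_package(power_w):
    # Typical chip-resistor power ratings, keeping a 50 % derating margin
    ratings = [("0402", 0.063), ("0603", 0.1), ("0805", 0.125),
               ("1206", 0.25), ("2512", 1.0)]
    for package, rating in ratings:
        if power_w <= 0.5 * rating:
            return package
    raise ValueError("use a power resistor")

def pick_trace_width_mil(current_a):
    # Crude stand-in for an IPC-2152-style current/width lookup
    if current_a <= 0.5:
        return 10
    if current_a <= 1.0:
        return 16
    return 30

print(pick_resistor_package(0.1))  # 0.1 W dissipation: a 1206 with margin
print(pick_trace_width_mil(0.8))   # 0.8 A: a 16 mil trace
```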
So Long, and Thanks for All the Fish
Electronics, hiking, retro-computing, electronic music etc.: https://www.youtube.com/c/FrankBussProgrammer
 

Offline FrankBussTopic starter

  • Supporter
  • ****
  • Posts: 2365
  • Country: de
    • Frank Buss
Re: current PCB design best practices are stone-age
« Reply #7 on: May 25, 2016, 04:03:42 pm »
http://www.altium.com/video-how-employ-snippets-design-reuse
I will just put this link here, so it does not get lost.
This looks painful. Six minutes to use a snippet, manual renumbering, etc. Imagine you had a Verilog for circuits and could just write "include powersupply; supply1=new 5Vsupply(); supply1.in = header-in; supply1.gnd = header-gnd; supply1.out=amplifier1.vcc;" in a few seconds, hit "compile", and get a nicely drawn circuit and the fully laid-out board.
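What that pseudo-Verilog might look like as an executable sketch, here in Python (Block, connect, and netlist are invented names for illustration; a real "compile" step would hand the resulting net groups to a placer and router):

```python
# Sketch of a netlist-building DSL: blocks expose named pins, connecting a
# pin to a net name wires it up, and "compiling" just groups pins by net.

class Block:
    def __init__(self, name, pins):
        self.name = name
        self.conn = {pin: None for pin in pins}

def connect(block, pin, net):
    block.conn[pin] = net

def netlist(blocks):
    # Group connected pins by net name: the input a placer/router needs
    nets = {}
    for b in blocks:
        for pin, net in b.conn.items():
            if net is not None:
                nets.setdefault(net, []).append(f"{b.name}.{pin}")
    return nets

supply1 = Block("supply1", ["in", "gnd", "out"])
amplifier1 = Block("amplifier1", ["vcc", "sig_in", "sig_out"])
connect(supply1, "in", "header_in")
connect(supply1, "gnd", "header_gnd")
connect(supply1, "out", "amp_vcc")
connect(amplifier1, "vcc", "amp_vcc")

print(netlist([supply1, amplifier1]))
```

A few lines of this replace the click-and-renumber workflow, and the description stays diffable and reusable like any other source code.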
So Long, and Thanks for All the Fish
Electronics, hiking, retro-computing, electronic music etc.: https://www.youtube.com/c/FrankBussProgrammer
 

Offline ataradov

  • Super Contributor
  • ***
  • Posts: 11228
  • Country: us
    • Personal site
Re: current PCB design best practices are stone-age
« Reply #8 on: May 25, 2016, 04:14:01 pm »
I worked on an FPGA project with a big Cyclone IV, which was over 90% full, with 100 MHz signals, and didn't need to do any manual placement.
I never had to do it myself, but I've seen Xilinx MicroBlaze source code, and quite literally everything there is manually tied to specific locations. MicroBlaze is a reusable component, of course, but still. I've also worked with a team of people doing highly structured designs (like banks of filters, correlators, specialized signal processing stuff), and they always did manual placement of the main components, since the automatic placer was not good enough.

And there was never a problem with the VHDL synthesis when my testbench worked and all timing constraints were met (except for occasional tool bugs).
VHDL deals with relatively simple basic blocks, and there is no real penalty for using a bit more than needed; you don't pay extra (unless you have to use a bigger FPGA, of course). In the real world you pay extra for every component, and making an automated tool smart enough is just not going to happen.

Do you have examples of auto-placers?
Not really. I never used them, but I've seen one as early as P-CAD 4.5 (for DOS), so I assume that in one way or another all tools like this must have this functionality.
Alex
 

Offline tszaboo

  • Super Contributor
  • ***
  • Posts: 7307
  • Country: nl
  • Current job: ATEX product design
Re: current PCB design best practices are stone-age
« Reply #9 on: May 25, 2016, 04:40:13 pm »
http://www.altium.com/video-how-employ-snippets-design-reuse
I will just put this link here, so it does not get lost.
This looks painful. Six minutes to use a snippet, manual renumbering, etc. Imagine you had a Verilog for circuits and could just write "include powersupply; supply1=new 5Vsupply(); supply1.in = header-in; supply1.gnd = header-gnd; supply1.out=amplifier1.vcc;" in a few seconds, hit "compile", and get a nicely drawn circuit and the fully laid-out board.
I don't think I would like any machine to automate my job (robots, everyone run!).
There are few automatable parts of a schematic. I never had dozens of op-amps in series doing filtering. I have had complicated designs, though, where the architecture phase took months to complete, even after we knew all the possible building blocks.
I don't know which part of the layout you want to automate. The design I'm working on now has a 3G modem and DDR RAM. My previous job had an LTZ1000 and a high-power op-amp dissipating heat. Even before that I was redesigning a PCB for lower cost. Tell that to a machine. And then there is DFM, DFT and everything else, which I'm really not sure how you would translate.
I never had 10000 uniform "whatever" components to route.
 

Offline T3sl4co1l

  • Super Contributor
  • ***
  • Posts: 21606
  • Country: us
  • Expert, Analog Electronics, PCB Layout, EMC
    • Seven Transistor Labs
Re: current PCB design best practices are stone-age
« Reply #10 on: May 26, 2016, 04:15:43 am »
This looks painful. Six minutes to use a snippet, manual renumbering, etc. Imagine you had a Verilog for circuits and could just write "include powersupply; supply1=new 5Vsupply(); supply1.in = header-in; supply1.gnd = header-gnd; supply1.out=amplifier1.vcc;" in a few seconds, hit "compile", and get a nicely drawn circuit and the fully laid-out board.

[Jokingly,]

Ahh, you digi-kiddies are so naive. You have it easy: a digital signal propagates down a wire, and dead-ends.  It goes into some gates, and that's end of story.

Real white-beard burly-man analog signals go back and forth.  Wires obey reciprocity: there is never action without reaction.  You can't simply place a power supply, without knowing how much current might be drawn from it!  You can't place an RF amplifier, without a specified system impedance!  You can't tune a filter without a defined frequency!

...But seriously, it would seem the sheer magnitude of complexity, and relative lack of demand ("screw it, we'll just throw it in a DSP!"), has obviated attempts at synthesized analog circuitry.  At least, that we know of...

They even tried analog computers, where between the frequent use of buffering amplifiers, and circuits with lightly-loading inputs, they can avoid reciprocity for the most part.  It didn't last long, though.  (Scarily: mechanical computers had been in use for quite some time as well, and usually have to obey reciprocity even more strongly than analog circuits!  Designing some of those beasts must've been such a chore, nevermind machining the thousands of tiny parts needed to build them!)

FWIW, IIRC, Verilog or one of the HDLs can specify analog circuitry just fine.  Perhaps more to the point: SPICE netlists specify circuitry just fine, too.  But AFAIK, neither one provides much semantic assistance, beyond compartmentalizing building blocks (blocks, instances, subcircuits, whatever) and basic type checking.  Strong typed languages are one thing, but the equivalent for analog circuitry must include the source's load characteristics as well as the load's sourcing characteristics!

A little more background on reciprocity:
Newton's law (action and reaction) is a succinct statement of the mechanical version. More broadly, it's a conservation law; a symmetry of the medium. In circuits, we have Kirchhoff's laws. In any case, any inputs and outputs of a system (whether it's a black box full of cogs with shafts on the outside, or a black box full of components with wires on the outside) must obey these rules.

Now, if we have an external source of power, we can conditionally divert some of the input or output "force" (force, torque, pressure, voltage, whatever..) to the supply, and violate reciprocity between any remaining set of shafts.

A digital logic gate does this by drawing supply current to force the output around.  In turn, some of the output voltage is fed back to the input, through various physical means (most often capacitance).  Which means, a change in the load characteristic can still have some effect on the input characteristic, though less than if the gate were not present.  Example: a 74HCU04 (unbuffered, so, a single CMOS inverter stage) will read less input capacitance when the output is shorted (held at a constant voltage), than when it's allowed to move on its own (where Miller effect takes place).  But a regular 74HC04's input characteristic will change only imperceptibly, because it contains three inverter stages in cascade.  Each stage is nonreciprocal (allowed because output power is sourced from the supplies rather than the input), but not completely so; for N stages in cascade, the nonreciprocity goes as R^-N, so if the one stage is at least modest, then feedback, from the output proper, very quickly becomes negligible!
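The R^-N falloff is easy to see numerically. This is a toy model (assuming each stage leaks back some fixed fraction of the disturbance at its output, which stands in for Tim's per-stage nonreciprocity figure):

```python
# Toy model of cascaded nonreciprocal stages: if each stage passes back only
# a fraction of its output disturbance, N stages in cascade leave fraction**N.

def reverse_leakage(per_stage_fraction, n_stages):
    return per_stage_fraction ** n_stages

hcu04_like = reverse_leakage(0.1, 1)  # single unbuffered stage: 10 % leaks back
hc04_like = reverse_leakage(0.1, 3)   # three cascaded stages: 0.1 %, negligible
print(hcu04_like, hc04_like)
```

Even a modest per-stage isolation compounds geometrically, which is why the buffered 74HC04's input barely notices what the output is doing.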

This is what's so easy about digital logic.  You can click pieces together without a care in the world for anything going "backwards".  If nothing else, the synthesizer knows to keep limits on the number of gates connected together, and to add buffers as needed, to transparently maintain that rule for you.

A lot of analog circuitry is about this, too, though.  One of the challenges of RF circuitry is building an amplifier with good isolation, i.e., low backwards feedback.  Indeed, if backwards feedback were zero, the available (stable) gain from a stage could be infinite!  Instead, such circuits oscillate, or are only stable with very particular input and load impedances.  It's desirable to have isolation between antennae (so internal signals don't go back out on the air), filters (so the filter's frequency response isn't upset by reflected power), mixers (so the undesired frequencies don't go where they're unwanted), and so on.

Tim
Seven Transistor Labs, LLC
Electronic design, from concept to prototype.
Bringing a project to life?  Send me a message!
 

Offline batteksystem

  • Regular Contributor
  • *
  • Posts: 167
  • Country: hk
    • My ebay store
Re: current PCB design best practices are stone-age
« Reply #11 on: May 26, 2016, 06:10:38 am »
http://www.altium.com/video-how-employ-snippets-design-reuse
I will just put this link here, so it does not get lost.
This looks painful. Six minutes to use a snippet, manual renumbering, etc. Imagine you had a Verilog for circuits and could just write "include powersupply; supply1=new 5Vsupply(); supply1.in = header-in; supply1.gnd = header-gnd; supply1.out=amplifier1.vcc;" in a few seconds, hit "compile", and get a nicely drawn circuit and the fully laid-out board.
I don't think I would like any machine to automate my job (robots, everyone run!).
There are few automatable parts of a schematic. I never had dozens of op-amps in series doing filtering. I have had complicated designs, though, where the architecture phase took months to complete, even after we knew all the possible building blocks.
I don't know which part of the layout you want to automate. The design I'm working on now has a 3G modem and DDR RAM. My previous job had an LTZ1000 and a high-power op-amp dissipating heat. Even before that I was redesigning a PCB for lower cost. Tell that to a machine. And then there is DFM, DFT and everything else, which I'm really not sure how you would translate.
I never had 10000 uniform "whatever" components to route.

On the other hand, there is definitely semi-automation: you can repeat some of the hand-drawn schematics (for example, the input and output stages of a multi-channel data acquisition device). However, this is only possible if you have standardized that circuit, while the majority of the circuit has to be thought out and drawn by an engineer, and would need to be different each time.

Online Ian.M

  • Super Contributor
  • ***
  • Posts: 12807
Re: current PCB design best practices are stone-age
« Reply #12 on: May 26, 2016, 06:44:13 am »
Then you've got stuff like unexpected electromagnetic, thermal, and even mechanical interactions between components. The EM stuff can be modelled, but component manufacturers usually don't provide enough data to do so. The thermal stuff is trickier: without a good model of the airflow over the board, and a deep understanding of which components and subsections are thermally sensitive, it's impossible to make intelligent placement decisions. Mechanical issues are extremely intractable. E.g. if you have a relay or SMPSU magnetics on the board, and sensitive nodes with large ceramic capacitors, you may have to switch to a capacitor type with a less piezoelectric dielectric if it is being excited by vibrations from the relay or magnetics. Move a mounting hole half an inch, add a heavy or large-area component, or change the switching frequency slightly, and the mechanical vibration nodes and antinodes move around the board unpredictably, changing which capacitors will be affected by the vibration.

A GOOD human designer is aware of potential interactions and tries to minimise their effect on the circuit. Short of a revolutionary breakthrough in true A.I., I don't expect it will ever be possible to automate this.
 

Offline John Coloccia

  • Super Contributor
  • ***
  • Posts: 1208
  • Country: us
Re: current PCB design best practices are stone-age
« Reply #13 on: May 26, 2016, 08:25:23 am »
I never had to do it myself, but I've seen Xilinx MicroBlaze source code and quite literally everything there is manually tied to specific locations.

Xilinx's software generates all of that for you. It's actually quite slick.
 

Offline obiwanjacobi

  • Frequent Contributor
  • **
  • Posts: 988
  • Country: nl
  • What's this yippee-yayoh pin you talk about!?
    • Marctronix Blog
Re: current PCB design best practices are stone-age
« Reply #14 on: May 26, 2016, 08:58:34 am »
I worked with OrCAD in the '80s, and when I took up electronics again a couple of years ago and looked at Eagle, my heart sank. Nothing had changed!

I would envision a model-based tool where all relevant information is specified in related models. The current way of working (for most tools, or all the tools I know) mixes too many different information domains in the same model, for example the schematic. Why not have a logical schematic that has the entire circuit spelled out? Then a related model that specifies constraints on those nets, and a separate model that defines how the circuit is "deployed" (how many PCBs, how they are connected, etc.): the physical schematic. Case design and so on could be integrated. A model for simulation (input stimuli, output validation), one for thermal stuff, a model for footprints, etc. Then the design of the individual PCBs can draw on all these models to determine how to lay out the components.

This is how I would do it...
[2c]
Arduino Template Library | Zalt Z80 Computer
Wrong code should not compile!
 

Online tggzzz

  • Super Contributor
  • ***
  • Posts: 19280
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: current PCB design best practices are stone-age
« Reply #15 on: May 26, 2016, 09:17:05 am »
This looks painful. 6 minutes to use a snipped, manual renumbering etc. Imagine you would have a Verilog for circuits and you could just write "include powersupply; supply1=new 5Vsupply(); supply1.in = header-in; supply1.gnd = header-gnd; supply1.out=amplifier1.vcc;"  in a few seconds, hit "compile" and you get a nicely drawn circuit and the fully layouted board.

Ah. Someone who thinks their circuits are digital, that wires are wires, and that ground exists. The former is only true in femtoamp and photon-counting circuits, and the latter only has any validity under very restricted circumstances.

As an illustration that wires are more complex than most people imagine, what's the transfer function between each of the connectors in this circuit: http://paginas.fe.up.pt/~hmiranda/etele/microstrip/hybrid_ring_cpl_1.jpg (assume a solid groundplane on the underside of the PCB).

And, obviously google for terms such as "signal integrity" "ground bounce" "EMI/EMC", and then read http://www.thehighspeeddesignbook.com/

The core problems are
  • models (e.g. as expressed in an HDL or Spice) are models; by definition they are a simplification that omits many details
  • even if the CAD systems could take all constraints into account, it would take a very long time to specify all the necessary constraints and to ensure the constraints actually permit a solution
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline ataradov

  • Super Contributor
  • ***
  • Posts: 11228
  • Country: us
    • Personal site
Re: current PCB design best practices are stone-age
« Reply #16 on: May 26, 2016, 04:04:53 pm »
Xilinx's software generates all of that for you. It's actually quite slick.
Yes, but behind the scenes there was a Xilinx engineer who manually placed logic and registers in specific slices. This part is typically encrypted and hidden from your eyes.
Alex
 

Offline rfeecs

  • Frequent Contributor
  • **
  • Posts: 807
  • Country: us
Re: current PCB design best practices are stone-age
« Reply #17 on: May 26, 2016, 09:51:31 pm »
Keysight ADS does co-simulation of Schematic, Layout, EM and Electro-thermal simulation:
https://youtu.be/ChZcUpH0rmk

National Instruments Microwave Office is similar.

I believe several of the high end simulators have these capabilities.
 

Offline altaic

  • Supporter
  • ****
  • Posts: 45
Re: current PCB design best practices are stone-age
« Reply #18 on: May 26, 2016, 10:05:33 pm »
The thoughts and arguments in this thread come up time and time again, but it seems everyone focuses on the tool being crap. Tools for FEM/FEA [1] are not crap, and good ones have existed for ages; automatic EM, thermal, and mechanical analysis is not impossible, nor improbable even in the next few years. It will be a great day when FEM, device simulation, and auto-routing come together in a sensible manner. But, as Ian.M pointed out, the crux here is that device manufacturers generally do not provide good enough models, or even decent data sheets in most cases.

The solution is obvious: force manufacturers to produce good models. How? Make better tools for producing good models cheaply. Then let capitalism do its thing: the value of the better (cheaper, faster, higher-quality) auto-routing process rewards supporting manufacturers with business and punishes non-supporting manufacturers with loss of business. Alternatively, third parties may produce good models and sell them.

As other processes (e.g. Occam [2]) emerge, manual layout will become much more difficult and the benefits of automatic layout will be overwhelmingly clear. I see automatic layout becoming the conventional practice as an eventuality, not a mere possibility as most suggest.

[1] https://en.wikipedia.org/wiki/Finite_element_method
[2] https://en.wikipedia.org/wiki/Occam_process
« Last Edit: May 26, 2016, 10:07:53 pm by altaic »
 

Offline John Coloccia

  • Super Contributor
  • ***
  • Posts: 1208
  • Country: us
Re: current PCB design best practices are stone-age
« Reply #19 on: May 27, 2016, 12:03:47 am »
Xilinx's software generates all of that for you. It's actually quite slick.
Yes, but behind the scenes there was a Xilinx engineer who manually placed logic and registers in specific slices. This part is typically encrypted and hidden from your eyes.

I'm afraid I don't understand what you mean. At some point, something has to put everything somewhere, and presumably someone had to write some sort of compiler/synthesis/whatever tool to do it. Actually, if you look at the output from the synthesis it's different every time. Normally, you save the output as part of your source control because it's not guaranteed to be the same (and usually isn't), and you don't want to risk some timing problem or other problem you overlooked creeping into your system.

You can't just toss a pile of silicon in a bucket and expect transistors to pop out because you will it. I don't see why someone would expect to toss a bunch of components at a computer and expect it to figure out your design intent either. Yeah, you can write a bunch of rules to explain the intent, and maybe that's even useful for some limited, repetitive or very similar tasks, much like software engineers do, but often it will be simpler and quicker to just do the task than try to explain how to do the task. It's not like the engineers behind these products are stupid. It's a far more difficult problem than you might imagine.
 

Offline ataradov

  • Super Contributor
  • ***
  • Posts: 11228
  • Country: us
    • Personal site
Re: current PCB design best practices are stone-age
« Reply #20 on: May 27, 2016, 12:08:47 am »
Actually, if you look at the output from the synthesis it's different every time.
Not in the case of MicroBlaze. The MicroBlaze source code fixes registers to specific locations (relative, not absolute).

It is written in VHDL, but that code is actually full of explicit placement directives and direct instantiations of primitives. There is not a single MicroBlaze codebase; there is one for each FPGA architecture. The right one is selected at the time you put it into the project.
Alex
 

Offline John Coloccia

  • Super Contributor
  • ***
  • Posts: 1208
  • Country: us
Re: current PCB design best practices are stone-age
« Reply #21 on: May 27, 2016, 01:45:30 am »
You're configuring a specific piece of hardware to make a processor. Do you expect that the best processor you can make comes from randomly scattering stuff all over the place? Is that how Intel makes a processor?

There's a huge difference between software and hardware. When you're dealing with hardware, you're dealing with physical devices that have to respect the laws of physics. With software, I have the option of basically designing the rules, and as long as I design good rules and apply them properly, everything works. With hardware, the laws of physics are the rules, and you must respect them.
 

Offline ataradov

  • Super Contributor
  • ***
  • Posts: 11228
  • Country: us
    • Personal site
Re: current PCB design best practices are stone-age
« Reply #22 on: May 27, 2016, 01:50:31 am »
You're configuring a specific piece of hardware to make a processor. Do you expect that the best processor you can make comes from randomly scattering stuff all over the place? Is that how Intel makes a processor?
If this is directed at me, then no, I don't expect this.

But they could easily have created it in pure VHDL and trusted the placer and router to do their job. They obviously did not trust them enough, and the number of times this design is reused justified some manual fiddling.

That is basically the same as OP's problem with PCB design - really good results still require a lot of manual input whether it is PCB or VHDL.
Alex
 

Offline John Coloccia

  • Super Contributor
  • ***
  • Posts: 1208
  • Country: us
Re: current PCB design best practices are stone-age
« Reply #23 on: May 27, 2016, 02:14:09 am »
You're configuring a specific piece of hardware to make a processor. Do you expect that the best processor you can make comes from randomly scattering stuff all over the place? Is that how Intel makes a processor?
If this is directed at me, then no, I don't expect this.

But they could easily have created it in pure VHDL and trusted the placer and router to do their job. They obviously did not trust them enough, and the number of times this design is reused justified some manual fiddling.

That is basically the same as OP's problem with PCB design - really good results still require a lot of manual input whether it is PCB or VHDL.

But that's the point. How can the tool do the best job when it doesn't understand the design intent? How could it possibly know the most efficient way of laying everything out? How could it possibly know how the APIs are written?  Or how all of the various pieces and parts that are configurable are configured? Everyone says that they want to just give it some rules to explain the intent and it should just go do it. Isn't that exactly what they did?

 

Offline ataradov

  • Super Contributor
  • ***
  • Posts: 11228
  • Country: us
    • Personal site
Re: current PCB design best practices are stone-age
« Reply #24 on: May 27, 2016, 05:37:27 am »
How can the tool do the best job when it doesn't understand the design intent?
That's my point exactly. And that's the reason why there will never be a good auto-placer, or even a good auto-router.
Alex
 

