Author Topic: Lattice iCE40 Bitstream Reverse-Engineered  (Read 16486 times)


Offline Bassman59

  • Frequent Contributor
  • **
  • Posts: 643
  • Country: us
  • Yes, I do this for a living
Re: Lattice iCE40 Bitstream Reverse-Engineered
« Reply #75 on: June 05, 2015, 06:10:56 am »
But the constraints file handles pinouts and part type variants! I don't get what the problem is.
The problem is that there is no visibility between the constraints file and the HDL

Because the HDL is concerned with the logic description, from a functional point of view. The constraints file controls the implementation.

From a logic perspective, pin assignment is irrelevant, as are pin drive strength, input termination, I/O supply voltage, I/O standard and even clock frequency. Putting all of that into the HDL just clutters the files with implementation details that make the design difficult to port.

Really, I don't see what's so difficult about creating a UCF file (if you're a Xilinx non-series-7 user) for your constraints and being done with it.

Quote
and no way to select a group of constraints (pins in particular) based on a single build option. And no way to select or access the device type.

You could use a makefile with the command-line xflow, where you specify all of that stuff.
 
I think that you're making this a lot harder than necessary. It seems to me that you've got one basic hardware design that you use everywhere. As it turns out, most FPGA designs aren't like that. Certainly many logic blocks get reused, but most designs are different enough that it works best to use a source-code-control system to pull in those reusable logic blocks, and to create the constraints file for each specific project.
 

Offline mikeselectricstuff

  • Super Contributor
  • ***
  • Posts: 10931
  • Country: gb
    • Mike's Electric Stuff
Re: Lattice iCE40 Bitstream Reverse-Engineered
« Reply #76 on: June 05, 2015, 06:54:00 am »
Quote
Really, I don't see what's so difficult about creating a UCF (if you're a Xilinx non-series-7 user) file for your constraints and being done with it.
That is not the problem. The problem is selecting between multiple sets of constraints for one design.
 
 
Quote
I think that you're making this a lot harder than necessary. It seems to me that you've got one basic hardware design that you use everywhere. As it turns out, most FPGA designs aren't like that. Certainly many logic blocks get reused, but most designs are different enough that it works best to use a source-code-control system to pull in those reusable logic blocks, and to create the constraints file for each specific project.
What is making things difficult is that FPGA tools use different conventions to software tools, for no other reason than history.
The decreasing cost of FPGAs means it is now routine to use them to augment microcontroller systems, which means the tools would be more productive if similar constructs were available.
Learning the tools can be a major part of development time, and where the required functionality isn't that complex, it can be disproportionate to the design effort.

In terms of how they are designed-in and used, FPGAs really aren't any different to MCUs at the topmost level - they're both chips that need code.
Arbitrary and unnecessary differences in how that code is created just get in the way of getting the job done.

I have no illusions that we'll see any improvement any time soon, either from the manufacturers' tools or OSS tools. It's just that I get annoyed when things are more difficult than they need to be to get the job done.

Like I said, I only know enough VHDL to get by, as I only use it occasionally, and even then have to look at old designs to remind myself how to do stuff.
With my matrix LED driver design (which is maybe 50 lines total, BTW), it took about 2 days of learning and experimenting to get it parameterised to a usable level to configure display resolutions and data formats, and I have 2 project files for the 2 FPGA types and copy the VHDL between them whenever I update it, because that's quicker and simpler for that particular situation. Learning how to dick around with makefiles and TCL would not have been a good investment of my time.

If the FPGA tools had, for example, the same preprocessor functionality as C, it would have taken maybe a couple of hours.
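For what it's worth, VHDL generics can cover some of that ground, even if they are clumsier than a C-style #define. Below is a minimal sketch of the kind of parameterisation described above; the entity name, generic names, widths and placeholder logic are invented for illustration, not taken from the actual design.

Code: [Select]
-- Hypothetical top level: resolution and data width are set per build via generics.
library ieee;
use ieee.std_logic_1164.all;

entity led_matrix_top is
    generic (
        NUM_ROWS   : positive := 16;   -- display height, overridable per project
        NUM_COLS   : positive := 32;   -- display width
        DATA_WIDTH : positive := 8     -- bits per pixel of the input data
    );
    port (
        clk      : in  std_logic;
        data_in  : in  std_logic_vector(DATA_WIDTH-1 downto 0);
        row_sel  : out std_logic_vector(NUM_ROWS-1 downto 0);
        col_data : out std_logic_vector(NUM_COLS-1 downto 0)
    );
end entity;

architecture rtl of led_matrix_top is
begin
    -- the real scan/shift logic would go here; the generics size every vector above
    row_sel  <= (others => '0');         -- placeholder
    col_data <= (others => data_in(0));  -- placeholder
end architecture;

In most vendor flows the top-level generics can usually be overridden per project (or via a thin per-board wrapper), so the shared VHDL file itself never needs touching when moving between the two FPGAs.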
On the flip side I now know a lot more VHDL, but I'll probably have forgotten it by the time I do a new FPGA project (or rather, a project that happens to have an FPGA hanging on the side of it).

This thread was about how FPGA tools could be made better, albeit via OSS.
All I'm saying is that there are some simple things that could be done to existing tools to make FPGA development more accessible to people familiar with software design flows.


« Last Edit: June 05, 2015, 06:57:13 am by mikeselectricstuff »
Youtube channel:Taking wierd stuff apart. Very apart.
Mike's Electric Stuff: High voltage, vintage electronics etc.
Day Job: Mostly LEDs
 

Offline hamster_nz

  • Super Contributor
  • ***
  • Posts: 1394
  • Country: nz
Re: Lattice iCE40 Bitstream Reverse-Engineered
« Reply #77 on: June 05, 2015, 08:30:40 am »
I'm about to have a bit of a rant (not a nasty rant, just a brain dump), so please excuse me....

What is making things difficult is that FPGA tools use different conventions to software tools, for no other reason than history.
...
In terms of how they are designed-in and used, FPGAs really aren't any different to MCUs at the topmost level - they're both chips that need code.
...

But HDL design isn't like software - it might look like software but it isn't. A lot of the abstractions that S/W gives you have gone. Let's take the first thing I was told in programming 101 about 30 years ago....

Quote
Computers can do three things:

* Sequential execution of sets of instructions
* Conditional execution of sets of instructions
* Iterative execution of sets of instructions (a.k.a. loops)

... and as programmers it is our job to tell the computer what instructions are needed to complete the task. Our example for the day was "instructions for an alien on how to make a cup of tea".

Working on FPGAs in an HDL looks the same (code in a text editor), but it is very different - you no longer have:

* Sequential execution (as everything occurs in parallel all over the chip)

* The nature of conditional execution changes, because you are no longer executing a series of statements but configuring a chain of muxes and digital logic.

* Loops do not really exist, unless they are bounded at compile time (see the sketch after this list).

* No fixed data types exist, apart from the fact that groups of binary bits can be treated as numbers.

* No dynamic resource allocation is possible - you can't just "malloc" 64k of SRAM into a design at runtime.
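On the loops point, the nearest thing VHDL has is a generate statement: the "loop" is unrolled by the tools at compile time into parallel hardware, it never executes. A minimal sketch (entity and signal names invented for illustration):

Code: [Select]
-- A compile-time "loop": each iteration becomes a physical flip-flop stage.
library ieee;
use ieee.std_logic_1164.all;

entity delay_line is
    generic ( STAGES : positive := 4 );
    port (
        clk : in  std_logic;
        d   : in  std_logic;
        q   : out std_logic
    );
end entity;

architecture rtl of delay_line is
    signal taps : std_logic_vector(0 to STAGES);
begin
    taps(0) <= d;

    chain : for i in 0 to STAGES-1 generate
        stage : process(clk)
        begin
            if rising_edge(clk) then
                taps(i+1) <= taps(i);  -- unrolled into STAGES separate registers
            end if;
        end process;
    end generate;

    q <= taps(STAGES);
end architecture;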

That is a lot of abstractions to lose, and on top of that we pick up new problems:

* Timing closure - is everything simple enough to complete within one tick of the clock?

* Clock domains and clock domain crossings are just annoying (see the sketch after this list)

* Interfacing with FPGA resources

* Fighting against resource limitations - we only have so much chip to use
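On the clock-domain-crossing point, the usual first aid is a two-flop synchronizer for single-bit signals. A minimal sketch (names invented for illustration; it only covers a single bit, buses and handshakes need more care):

Code: [Select]
-- Two-flop synchronizer: brings an asynchronous single-bit signal into clk_dst's domain.
library ieee;
use ieee.std_logic_1164.all;

entity sync_2ff is
    port (
        clk_dst  : in  std_logic;   -- destination clock domain
        async_in : in  std_logic;   -- signal from another (or no) clock domain
        sync_out : out std_logic
    );
end entity;

architecture rtl of sync_2ff is
    signal meta, stable : std_logic := '0';
begin
    process(clk_dst)
    begin
        if rising_edge(clk_dst) then
            meta   <= async_in;  -- may go metastable
            stable <= meta;      -- gets a full clock period to settle
        end if;
    end process;

    sync_out <= stable;
end architecture;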

You can write the instructions for an alien to make a cup of tea in HDL, but it is for an alien with no short-term memory. It looks completely different, involving a finite state machine, watching a clock and remembering lots of state information.
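To make that concrete, here is roughly the shape even a toy version of the tea-making task takes once it becomes a state machine watching a clock. This is only an illustrative sketch; every name in it is invented, and the real sequencing would be more involved:

Code: [Select]
-- "Make tea" as a clocked state machine: all the short-term memory is explicit state.
library ieee;
use ieee.std_logic_1164.all;

entity tea_maker is
    port (
        clk          : in  std_logic;
        start        : in  std_logic;
        water_boiled : in  std_logic;   -- status inputs from the outside world
        tea_steeped  : in  std_logic;
        heater_on    : out std_logic;   -- outputs that drive the hardware
        pour         : out std_logic;
        done         : out std_logic
    );
end entity;

architecture rtl of tea_maker is
    type state_t is (IDLE, BOIL, POUR_WATER, STEEP, FINISHED);
    signal state : state_t := IDLE;
begin
    process(clk)
    begin
        if rising_edge(clk) then
            case state is
                when IDLE       => if start = '1' then state <= BOIL; end if;
                when BOIL       => if water_boiled = '1' then state <= POUR_WATER; end if;
                when POUR_WATER => state <= STEEP;
                when STEEP      => if tea_steeped = '1' then state <= FINISHED; end if;
                when FINISHED   => null;  -- stay done (reset not shown)
            end case;
        end if;
    end process;

    -- outputs are decoded from the state, not from "where we are in the code"
    heater_on <= '1' when state = BOIL else '0';
    pour      <= '1' when state = POUR_WATER else '0';
    done      <= '1' when state = FINISHED else '0';
end architecture;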

And this is why learning HDL is so painful. You hear people say "I've had 20 years of programming, but I find FPGAs hard". This is because

(a) it is complex - implementing a non-trivial high speed digital design is hard work

(b) it is like somebody saying "I've got a degree in literature and used a word processor to write 20 books, yet I find writing a working Arduino program hard". Of course it is. Although it looks the same, it isn't what you are skilled in.

A course in digital logic is more appropriate as a grounding in low-level FPGA work than a course in programming. If you have ever used TTL logic to build a video card, then you would say "My, this FPGA stuff is a walk in the park, it's so fast and flexible, and turnaround time is a few minutes".

With the foundations being so different, it follows that the tools will be different - for example, that is why there is no 'gdb' for VHDL - it is the wrong tool for the job.





« Last Edit: June 05, 2015, 08:32:11 am by hamster_nz »
 

Offline mikeselectricstuff

  • Super Contributor
  • ***
  • Posts: 10931
  • Country: gb
    • Mike's Electric Stuff
Re: Lattice iCE40 Bitstream Reverse-Engineered
« Reply #78 on: June 05, 2015, 08:48:54 am »
I'm about to have a bit of a rant (not a nasty rant, just a brain dump), so please excuse me....

What is making things difficult is that FPGA tools use different conventions to software tools, for no other reason than history.
...
In terms of how they are designed-in and used, FPGAs really aren't any different to MCUs at the topmost level - they're both chips that need code.
...

But HDL design isn't like software - it might look like software but it isn't
I wasn't suggesting otherwise. It has nothing to do with the semantics of the language itself, but with the process of managing the deployment of that code to actual devices.
Whilst the meaning of what's written and the design process are different, the way the coding process fits into the product development process is basically the same: you write code to make the chip do what you want, and you use a tool to turn your code into something that the chip can understand.
Like I said, FPGAs and MCUs are both chips that need code.
That code is part of the product. There is no reason that the ways of managing that code at the "preprocessor" and "project" level should be any different - it's just an accident of history, but it makes things harder, less consistent and less easy to understand.
Maybe if Ada had won out over C it wouldn't have been so bad...




Youtube channel:Taking wierd stuff apart. Very apart.
Mike's Electric Stuff: High voltage, vintage electronics etc.
Day Job: Mostly LEDs
 

Offline Muxr

  • Super Contributor
  • ***
  • Posts: 1293
  • Country: us
Re: Lattice iCE40 Bitstream Reverse-Engineered
« Reply #79 on: June 05, 2015, 12:48:45 pm »
To hamster_nz's point: I think programmers in general have a hard time with low-level programming, since most development happens at a higher level of the stack, and you're encouraged to reuse or leverage other people's low-level tools and libs to save time and keep the project scope focused.

But I don't think an HDL is particularly hard. It kind of reminds me of using a coroutine or a green-thread framework in networking, except you're very restricted. But then again I have worked with developers who struggle with multithreaded programming. What has helped me, I think, is that I have assembly programming experience and that writing multithreaded code is what I do at my day job.

I've certainly had a harder time grasping some higher level languages (Haskell) or concepts. At least in an HDL, once you get the basic building blocks you can establish a solid fundamental understanding of how everything else is built, because it has to adhere to the basic principles of digital logic, which are set in stone (0, 1, X, Z was a bit weird though), whereas higher level paradigms in your standard programming languages are laced with syntactic sugar and magic that often make little sense.

For instance, even after knowing and using Perl, I still come across code I don't understand, whereas after only a few days of learning Verilog I can read other people's code with a fair amount of confidence that I understand what's going on.

I don't think there are as many people who write an HDL, or if there are, it's not as well represented as your [regular] software development. Perhaps that's because most of it is proprietary (ASIC design) and not as open. That, I think, is what makes it tougher to get into, since there are fewer resources to learn from out there.
« Last Edit: June 05, 2015, 02:47:36 pm by Muxr »
 

Offline Bassman59

  • Frequent Contributor
  • **
  • Posts: 643
  • Country: us
  • Yes, I do this for a living
Re: Lattice iCE40 Bitstream Reverse-Engineered
« Reply #80 on: June 06, 2015, 01:47:47 am »
But I don't think an HDL is particularly hard.

HDLs are not at all hard if your background is in digital logic design.

Quote
At least in an HDL, once you get the basic building blocks you can establish a solid fundamental understanding of how everything else is built, because it has to adhere to the basic principles of digital logic, which are set in stone (0, 1, X, Z was a bit weird though)

0, 1, X, Z are not weird -- at its base, binary logic has only two possible states (you can guess what they are). You need X to model contention or some other unknown state, and you need Z to model the high-Z (tri-state) driver condition.
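A tiny simulation-only sketch of how that plays out with VHDL's resolved std_logic type (signal names invented for illustration): a 'Z' driver yields to any active driver, while two active drivers fighting resolve to 'X'.

Code: [Select]
-- std_logic resolution: 'Z' yields to a real driver, contention gives 'X'.
library ieee;
use ieee.std_logic_1164.all;

entity resolution_demo is
end entity;

architecture sim of resolution_demo is
    signal bus_a : std_logic;  -- one tri-stated driver, one active driver
    signal bus_b : std_logic;  -- two active drivers fighting
begin
    bus_a <= 'Z';   -- driver 1: released (high impedance)
    bus_a <= '1';   -- driver 2: actively driving high  => bus_a resolves to '1'

    bus_b <= '0';   -- driver 1: driving low
    bus_b <= '1';   -- driver 2: driving high            => bus_b resolves to 'X'
end architecture;

In simulation, bus_a settles to '1' and bus_b to 'X', which is exactly the contention case described above.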

Quote
whereas higher level paradigms in your standard programming languages are laced with syntactic sugar and magic that often make little sense.

They make perfect sense if you are versed in the language and its paradigms. At first glance, Objective-C looks "weird" to the K&R C programmer, but after spending some time understanding it, you'll see that it makes sense.

Quote
I don't think there are as many people who write an HDL or if there are it's not as represented as your [regular] software development.

I'm not sure whether you mean "who write an HDL" as a) Engineers who use an HDL to describe logic for implementation in an FPGA or ASIC (or for simulation and verification, which is necessary), or b) tools designers who write synthesis and simulation tools for those HDLs.

But in either case, there are a lot more people writing software for all of the various processors than there are people designing FPGAs or ASICs.
 

Offline Muxr

  • Super Contributor
  • ***
  • Posts: 1293
  • Country: us
Re: Lattice iCE40 Bitstream Reverse-Engineered
« Reply #81 on: June 06, 2015, 02:18:05 am »
(0, 1, X, Z) was weird for me at first, since my background is mainly software design. I mean, there is nothing hard about the states themselves, they make perfect sense; what wasn't as easy was remembering the logic operation outcomes for X and Z, which aren't always obvious to me. You probably don't have issues with it since it's what you do on a daily basis. Not difficult, I agree, just a bit weird for someone with my background.

Objective-C was pretty straightforward for me; people who aren't used to managing memory probably get hung up on the reference counting, but I didn't have any issues with it.

What I was referring to is higher-level patterns which might not be obvious for someone not familiar with the pattern, like the first time you come across inversion of control in a large project (it may not be so obvious), or the first time you use a functional language like F# or Haskell. But mainly I am talking about things people do in languages like Perl or Ruby, where the language itself can be introspected and modified to implement new language features. In Ruby you can completely change how String behaves, and I know of libraries that do just that; this all can make things really hard to understand. You can change the entire language, which is why DSLs are common in Ruby.

Or languages which pride themselves on having a lot of syntactic sugar.

For instance, this is valid Perl, and I know programmers who get a kick out of writing clever code nobody can understand:

Code: [Select]
''=~('(?{'.(']])@+}'^'-/@._]').'"'.('/<[*-_<+>?}{>]@}[email protected]}]])@+}@<[*-_<+>?}{>]@^'^'`^=_^<]_[[]+[/,]_/]-/@._]/^=_^<]_[[]+[/,|').',$/})')

It prints a message.

These features are what make Perl powerful, but it's also the reason why the code can be a nightmare to understand.
« Last Edit: June 06, 2015, 02:36:47 am by Muxr »
 

