Geez... you are in the hands of GPU makers who have shrunk the market to two options over the last few decades...
What is good about that? Rant?
give me a break about this nonsense rant thing
Paul
Like any other market, there is a cost to enter. Apparently, other manufacturers and potential manufacturers don't see a way to enter the market and make a profit. If it was easy, we would be knee deep in competitive graphics boards and programming them would take on an Arduino feel.
NVIDIA, OTOH, has invested a HUGE amount of money developing the technology and backing it up with free tools. Just the cost of feeding the researchers, engineers, mathematicians and technical writers must be costing them a TON of money. From their stock price, I have to assume the market thinks they're doing excellent work.
Don't get me wrong, I have always hated the NVIDIA card in my Linux machine. When I bought it nearly 20 years ago, I had to rebuild the drivers every time the kernel changed and it wasn't a pleasant prospect. Closed source drivers kept me away from NVIDIA for a long time.
Things change, and I want to play with massively parallel computing. Where else can I buy an 8192-core FP processor? What other company has all of the bits and pieces to make it possible for a hobbyist to work on such things? And their community forum is excellent. Couple the hardware with Julia or MATLAB and even the elderly can play in the big leagues.
We're talking multiple teraflops of compute power in a laptop. We went to the Moon with machines performing, at best, 3 megaflops in a CDC 6600 mainframe. I can have more compute power in my workshop than existed on the entire planet in 1969. Without the huge electric bill!
Anybody want to compete? Well, step right up! Go hire a raft of the smartest people on the planet and give it a shot! We'll know from the stock price whether they are making money.
To bring this back to Julia, it certainly looks like the language has all of the libraries and such to take advantage of NVIDIA's offerings, plus a way to adapt to newcomers. I don't know that Julia will catch on like Python, but it certainly has possibilities.
The Julia code for digit recognition over the MNIST dataset is readily available:
https://github.com/crhota/Handwritten-Digit-Recognition-using-MNIST-dataset-and-Julia-Flux/blob/master/src/Handwriting%20Recognition.ipynb
There are a ton of comments and the code is pretty easy to follow. I may eventually start to like the Jupyter Notebook approach.
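For anyone who wants a feel for that kind of code before opening the notebook, here is a minimal sketch of a Flux classifier for flattened 28x28 MNIST digits. The layer sizes here are my own illustrative choices, not necessarily the ones used in the linked notebook:

```julia
using Flux

# A small fully connected classifier for flattened 28x28 MNIST images.
# The hidden-layer width (32) is an illustrative choice, not the
# notebook's exact architecture.
model = Chain(
    Dense(784 => 32, relu),  # 784 input pixels -> 32 hidden units
    Dense(32 => 10),         # 10 output classes, one per digit
    softmax)

# Training pairs a loss (typically cross-entropy against one-hot
# labels) with an optimiser such as ADAM; the linked notebook and the
# Flux tutorials walk through that loop in full.
```

The whole model definition fits on a few lines, which is much of Julia's appeal here.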
In terms of Julia and Flux, here's a gentle introduction:
https://fluxml.ai/tutorials/2020/09/15/deep-learning-flux.html