" chip design cycle is also probably one of the most complicated engineering processes that exists in the world,” said Institute Professor Siddharth Garg (ECE). “There’s a saying that rocket science is hard, but chip design is harder.” "
Why not Rockets for the rest of us first, if that's easier?
Rocket science is hard because you have to burn your own cash if you are not connected. Chip design is less risky for the individual, but it's been harder (so far) to signal your mastery to the funders.
> To address this challenge, Garg and colleagues scoured Verilog code on GitHub and excerpted content from 70 Verilog textbooks to amass the largest AI training dataset of Verilog ever assembled. The team then created VeriGen, the first specialized AI model trained solely to generate Verilog code.
I expect this will become the norm in a number of fields. Perhaps COBOL is next?
> Consequently, the NYU researchers’ goal is to make chip design more accessible, so nonengineers, whatever their background can create their own custom-made chips.
I'm just as confused as you are, honestly. It feels like we've seen the "ASIC for everything" campaign so many times over, and yet only FPGAs and CUDA typically find adoption in the industry.
A lot of my questions went away when I got to this line though:
> He’s also fully engaged in the third leg of the “democratizing chip design” stool: education.
This is a valiant effort. Chip design is a hard world to break into, and many applications that could benefit from ASICs aren't iterating or testing on it because it sucks to do. It's a lot of work to bring that skill ceiling down, but as a programmer I could see how an LLVM-style intermediate representation layer could help designers get up-and-running faster.
Isn't HDL basically the intermediate representation you want? Plus, you can learn it with simulation or FPGA dev board which makes it reasonably accessable
> I'm just as confused as you are, honestly. It feels like we've seen the "ASIC for everything" campaign so many times over, and yet only FPGAs and CUDA typically find adoption in the industry.
That's because we don't need more digital. Digital transistors are effectively free (to a first approximation).
The axes that we need more of involve analog and RF. Less power consumption, better RF speed/range, higher speed PCI, etc. all require messy analog and RF design. And those are the expensive tools. Those are also the complex tools require genuine knowledge.
Now, if your AI could deliver analog and RF, you'd make a gazillion dollars. The fact that everybody knows this and still haven't pulled it off should tell you something.
Would you really earn more money doing this than monopolizing online search advertising? Because I find that hard to believe. Hardware seems like a miserable business.
We have textual slop, visual slop, audio slop, so we asked: "What else do we want to sloppify?". And then it dawned on me. ICs. ICs haven't been slopped yet — sure, we could ask the machine to generate some vhdl, but that isn't the same. So we present: Silicon Slop.
I am actually astonished. Is this what happens when the NYU board of directors tells every department they have to use and create AI, or they will stop funding? What is going on?
" chip design cycle is also probably one of the most complicated engineering processes that exists in the world,” said Institute Professor Siddharth Garg (ECE). “There’s a saying that rocket science is hard, but chip design is harder.” "
Why not Rockets for the rest of us first, if that's easier?
Rocket science is hard because you have to burn your own cash if you are not connected. Chip design is less risky for the individual, but it's been harder (so far) to signal your mastery to the funders.
The difficulty is not (entirely) technical
There are already lots of rockets for the rest of us; they're just not as big
Link to the BASICS course mentioned: https://engineering.nyu.edu/academics/programs/digital-learn...
Link to the Zero to ASIC course that they are collaborating with: https://www.zerotoasiccourse.com/digital/
I wish for free alternatives to these.
The uni PR wasn't bad faith, just bad placement. Source is here:
https://github.com/shailja-thakur/VGen
Earlier work from NYU (2023)
https://zenodo.org/records/7953725
Related (?) blog post (2023)
https://01001000.xyz/2023-12-21-ChatGPT-AI-Silicon/
> To address this challenge, Garg and colleagues scoured Verilog code on GitHub and excerpted content from 70 Verilog textbooks to amass the largest AI training dataset of Verilog ever assembled. The team then created VeriGen, the first specialized AI model trained solely to generate Verilog code.
I expect this will become the norm in a number of fields. Perhaps COBOL is next?
> Consequently, the NYU researchers’ goal is to make chip design more accessible, so nonengineers, whatever their background, can create their own custom-made chips.
What?
"ChatGPT: Please design a chip for me."
Basically.
I'm just as confused as you are, honestly. It feels like we've seen the "ASIC for everything" campaign so many times over, and yet only FPGAs and CUDA typically find adoption in the industry.
A lot of my questions went away when I got to this line though:
> He’s also fully engaged in the third leg of the “democratizing chip design” stool: education.
This is a valiant effort. Chip design is a hard world to break into, and many applications that could benefit from ASICs aren't being iterated on or tested because it sucks to do. It's a lot of work to bring that skill ceiling down, but as a programmer I could see how an LLVM-style intermediate-representation layer could help designers get up and running faster.
Isn't HDL basically the intermediate representation you want? Plus, you can learn it with simulation or an FPGA dev board, which makes it reasonably accessible.
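For a sense of scale, here is roughly what a first HDL exercise looks like: an illustrative 8-bit counter plus a throwaway testbench (module names made up for the example), which you can run in a free simulator such as Icarus Verilog with no dev board at all.

```verilog
// Illustrative only: an 8-bit counter, about the smallest "real" HDL design.
module counter (
    input  wire       clk,
    input  wire       rst,
    output reg  [7:0] count
);
    always @(posedge clk) begin
        if (rst)
            count <= 8'd0;
        else
            count <= count + 8'd1;
    end
endmodule

// Throwaway testbench: tick the clock, release reset, dump a waveform.
// Example run (Icarus Verilog): iverilog counter.v && vvp a.out
module counter_tb;
    reg        clk = 1'b0;
    reg        rst = 1'b1;
    wire [7:0] count;

    counter dut (.clk(clk), .rst(rst), .count(count));

    always #5 clk = ~clk;            // 10-time-unit clock period

    initial begin
        $dumpfile("counter.vcd");    // waveform viewable in GTKWave
        $dumpvars(0, counter_tb);
        #12 rst = 1'b0;              // release reset after the first edge
        #100 $finish;
    end
endmodule
```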
All I remember from my experience with VHDL/Verilog is that they really truly suck.
> I'm just as confused as you are, honestly. It feels like we've seen the "ASIC for everything" campaign so many times over, and yet only FPGAs and CUDA typically find adoption in the industry.
That's because we don't need more digital. Digital transistors are effectively free (to a first approximation).
The axes where we need more involve analog and RF. Lower power consumption, better RF speed/range, higher-speed PCI, etc. all require messy analog and RF design. And those are the expensive tools. Those are also the complex tools that require genuine knowledge.
Now, if your AI could deliver analog and RF, you'd make a gazillion dollars. The fact that everybody knows this and still hasn't pulled it off should tell you something.
Would you really earn more money doing this than monopolizing online search advertising? Because I find that hard to believe. Hardware seems like a miserable business.
That might change if geopolitical tensions fragment the global supply chains.
Being a fab is a garbage business.
Being a software supplier to fabless semiconductor companies is a very profitable business.
In the Gold Rush, the people who came out rich were selling the shovels and denim.
We have textual slop, visual slop, audio slop, so we asked: "What else do we want to sloppify?". And then it dawned on me. ICs. ICs haven't been slopped yet — sure, we could ask the machine to generate some VHDL, but that isn't the same. So we present: Silicon Slop.
I am actually astonished. Is this what happens when the NYU board of directors tells every department they have to use and create AI, or they will stop funding? What is going on?
Ah, thanks; we definitely needed more artisanal, real human social media slop like this.
Improving the lived experience keeping it real! Feels so much more authentic.
More people would love AI if it communicated like an emo *Nix elitist. Train it on Daria, Eeyore, and grunge lyrics! People will love it!
Bootstrap framework for chips, Verilog stolen from books and from GitHub.