

Are you seriously asking for sources for things that HAVE NOT BEEN DONE YET? Is that what you’re asking for here? 🤡


The point is pretty clear: the EU is not a genuine democracy in any meaningful sense. It’s a neoliberal project run by the most corrupt people imaginable.


I love how you just keep repeating the same thing over and over. Your whole argument is that we need some amazing breakthrough to make other materials viable, but the reality is that it’s just a matter of investment over time. That’s it. China is investing in the development of new substrates at the state level, which is effectively unlimited funding. The capitalist economic arguments don’t apply here. If you think they won’t be able to figure this out, then prepare to be very surprised in the near future.


Oh right, the famous laws of physics that apparently decree silicon must forever be the cheapest material. Let me check my physics textbook real quick. Yep, still says nothing about global supply chains and sixty years of trillion-dollar investment being a fundamental force of nature.
Silicon is cheap because we made it cheap. We built the entire modern world around it. We constructed factories so complex and expensive they become national infrastructure projects. We perfected processes over many decades. That’s not physics, that’s just industrial inertia on a planetary scale.
To claim nothing else could ever compete requires ignoring how technological progress actually works. Remember when aluminum was a precious metal for royalty? Then we figured out how to produce it at scale and now we make soda cans out of it. Solar panels, lithium batteries, and fiber optics were all once exotic and prohibitively expensive until they weren’t.
As you yourself pointed out, germanium was literally the first transistor material. We moved to silicon because its oxide was more convenient for the fabrication tricks we were developing at the time, not because of some cosmic price tag. If we had poured the same obsessive investment into germanium or gallium arsenide, we’d be having this same smug conversation about them instead.
Similarly, graphene isn’t too expensive because physics. It’s too expensive because we’re still learning how to make it in bulk with high quality. Give it a fraction of the focus and funding that silicon has enjoyed and watch the cost curve do the same dramatic dive. The inherent cost argument always melts away when the manufacturing muscle shows up.
The only real law at play here is the law of economies of scale. Silicon doesn’t have a magical property that makes it uniquely cheap. It just has a sixty-year head start in the world’s most aggressive scaling campaign. If and when we decide to get serious about another material, your “laws of physics” will start looking a lot more like a temporary price tag.
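To put rough numbers on that, the textbook way to model it is an experience curve (Wright’s law): every doubling of cumulative production knocks a roughly constant percentage off the unit cost. Here’s a quick back-of-the-envelope sketch; the $10,000 starting cost and 20% learning rate are purely illustrative assumptions, not figures for any real substrate:

```python
import math

# Experience curve (Wright's law): unit_cost = first_unit_cost * n ** -b,
# where b follows from the learning rate (the % cost drop per doubling of
# cumulative production). The 20% learning rate here is an illustrative guess.
def unit_cost(first_unit_cost, cumulative_units, learning_rate=0.20):
    b = -math.log2(1 - learning_rate)
    return first_unit_cost * cumulative_units ** -b

# A hypothetical new substrate starting at $10,000 per usable wafer:
for n in (1, 1_000, 1_000_000, 1_000_000_000):
    print(f"{n:>13,} wafers -> ${unit_cost(10_000, n):,.2f} each")
```

Run that and the hypothetical $10,000 wafer lands around $13 after a billion units. The curve doesn’t care what the material is, only how much of it you’ve made.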


Proof and sources for what specifically?


I’ve already explained the dynamic numerous times in this very thread.


Yeah, Linux makes Macs a lot more appealing.


I’m beginning to get the impression you don’t actually understand what the term economies of scale means.


What I keep explaining to you here is that silicon is not inevitable, and that it’s obviously possible to make other substrates work and bring their costs down. I’ve also explained why it makes no business sense for companies already invested in silicon to do that. The reason China has a big incentive is that they don’t currently have the ability to make top-end chips. So they can do moonshot projects at the state level, and if one of them succeeds, they can leapfrog a whole generation of tech that way.
You just keep repeating that silicon is the best material for the job without substantiating that in any way. Your whole argument is tautological, amounting to saying that silicon is widely used and therefore it’s the best fit.


Again, silicon was just the first material people figured out how to mass produce. The fact that it was cheaper then doesn’t mean a new material put into mass production won’t get cheaper too. Look at the history of literally any technology that became popular and you’ll see this to be the case.


If you look at the price of silicon chips from their inception to now, you can see how much it’s come down. If a new material starts being used, the exact same thing will happen. Silicon was the first substrate people figured out how to use to make transistors, and it continued to be used because it was cheaper to improve the existing process than to invent a new one from scratch. Now that we’re hitting the physical limits of what you can do with the material, the logic is changing. A chip that can run an order of magnitude faster will also use less power, and both of those are incredibly desirable properties in the age of AI data centres and mobile devices.


The cost invariably goes down as production of any new technology ramps up though.


My heart bleeds for them.


It’s only a matter of time until somebody figures out how to mass produce a computing substrate that will make silicon look like vacuum tubes. We don’t need to discover any new physics here. Numerous substrates have been shown to outperform silicon by at least an order of magnitude in the lab. This is simply a matter of allocating resources in a sustained fashion towards scaling these proofs of concept into mass production, something planned economies happen to excel at.


A lot of it stems from the fact that the Chinese education system is brutally competitive, and rich parents can’t buy their kids’ way into university. So, instead, they often just send their kids to prestigious western universities.


The secret sauce here is how the model was trained. Typically, coding models are trained on static snapshots of code from GitHub and other public sources, so they basically learn what good code looks like at a single point in time. IQuest did something totally different: they trained their model on the entire commit history of repositories.
This approach added a temporal component to training, allowing the model to learn how code actually changes from one commit to the next. It saw how entire projects evolve over months and even years. It learned the patterns in how developers refactor and improve code, and the real world workflows of how software gets built. Instead of just learning what good code looks like, it learned how code evolves.
Coding is inherently an iterative process where you make an attempt at a solution and then iterate on it. As you gain a deeper understanding of the problem, you end up building on top of existing patterns and evolving the codebase over time. The IQuest model gets how that works because it was trained on that entire process.
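For what it’s worth, I have no inside knowledge of IQuest’s actual pipeline, but the general idea of adding a temporal axis to the training data is easy to sketch: instead of sampling files from one snapshot, you walk the commit history and emit before/after pairs along with the commit message, so the model sees edits rather than just final states. Something roughly like this, where the repo path, branch name, and output format are all placeholder assumptions:

```python
import subprocess

# Sketch: turn a repo's commit history into (before, after, message) training
# examples so a model can learn how code evolves, not just what it looks like.
def git(repo, *args):
    return subprocess.run(["git", "-C", repo, *args],
                          capture_output=True, text=True, check=True).stdout

def show_file(repo, commit, path):
    try:
        return git(repo, "show", f"{commit}:{path}")
    except subprocess.CalledProcessError:
        return ""  # file was added or deleted in this commit

def commit_pairs(repo, branch="main"):
    # Commit hashes on the branch, oldest first.
    hashes = git(repo, "rev-list", "--reverse", branch).split()
    for parent, child in zip(hashes, hashes[1:]):
        message = git(repo, "log", "-1", "--format=%s", child).strip()
        # Files touched between the two commits.
        for path in git(repo, "diff", "--name-only", parent, child).splitlines():
            if path:
                yield {"message": message, "path": path,
                       "before": show_file(repo, parent, path),
                       "after": show_file(repo, child, path)}

# Usage: for example in commit_pairs("/path/to/repo"): feed it to the trainer.
```

Each example then carries the edit itself plus the developer’s stated intent, and that before/after structure is where the temporal signal comes from.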


Aww, look at you still malding that your color revolution failed.
Can you tell me what sources you two are asking for? My argument is that economies of scale make new technologies cheaper over time because industrial processes become refined, people learn better and cheaper ways to produce things, and scaling up production brings the cost down. What are you asking me to source here specifically?