Quilter, which has raised more than $40 million from investors including Benchmark, Index Ventures, and Coatue, used its physics-driven AI to automate the design of a two-board computer system that booted successfully on its first attempt, requiring no costly revisions. The project, internally dubbed “Project Speedrun,” required just 38.5 hours of human labor compared to the 428 hours that professional PCB designers quoted for the same task.

  • A_A@lemmy.world · 4 points · 4 hours ago

    Playing games against the laws of physics means playing games against reality. This is similar to how humans develop, so IMO this approach will go way beyond fabricating computer boards.

  • wjrii@lemmy.world · 39 points · edited · 7 hours ago

    “Language models don’t apply to us because this is not a language problem,” Nesterenko explained. “If you ask it to actually create a blueprint, it has no training data for that. It has no context for that…” Instead, Quilter built what Nesterenko describes as a “game” where the AI agent makes sequential decisions — place this component here, route this trace there — and receives feedback based on whether the resulting design satisfies electromagnetic, thermal, and manufacturing constraints… The approach mirrors DeepMind’s progression with its Go-playing systems.

    This is kind of interesting and cool, and it’s not a hallucinating LLM. I’ve designed a couple of simple circuit boards, and running traces can be sort of zen, but it is tedious and would be maddening as a job, so I can only imagine what the process must be like on complex projects from scratch. Definitely some hype levels coming from the company that give me pause, but it seems like an actual useful task for a machine learning algorithm.
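    The "game" formulation in the quote above can be sketched as a toy sequential-placement loop. This is entirely hypothetical code, not Quilter's actual system; the grid, the adjacency "clearance" rule, and the terminal reward are stand-ins for real electromagnetic, thermal, and manufacturing checks.

```python
import random

# Hypothetical sketch of the "game" loop described in the quote: an agent
# makes sequential placement decisions and receives a terminal reward for
# satisfying constraints. The grid and adjacency rule are toy stand-ins;
# none of this is Quilter's actual system.

GRID = 8  # 8x8 placement grid

def violates_clearance(placements, cell, min_dist=1):
    """Toy design rule: no two components may sit in adjacent cells."""
    x, y = cell
    return any(abs(x - px) <= min_dist and abs(y - py) <= min_dist
               for px, py in placements)

def play_episode(n_components=4, rng=random):
    """One episode: place components one at a time, reward at the end."""
    placements = []
    for _ in range(n_components):
        legal = [(x, y) for x in range(GRID) for y in range(GRID)
                 if not violates_clearance(placements, (x, y))]
        if not legal:
            return placements, -1.0  # constraints unsatisfiable: negative reward
        # A trained agent would use a learned policy here; we pick randomly.
        placements.append(rng.choice(legal))
    return placements, 1.0           # all constraints satisfied: positive reward

board, reward = play_episode()
```

    A real system would replace the random choice with a learned policy and score intermediate states, but the reward-on-constraint-satisfaction structure is the point.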

    • jacksilver@lemmy.world · 5 points · 3 hours ago

      I was going to ask how this is different from a standard reinforcement learning algorithm, but then they called out DeepMind's AlphaGo.

    • chrash0@lemmy.world · 22 points · 7 hours ago

      as someone who used to work on “expert models” i’m excited that not everyone has abandoned them for “what if we just had a model that knows everything (that doesn’t exist) and costs a billion dollars to run”

    • givesomefucks@lemmy.world · 9 points · 7 hours ago

      Yeah…

      But you know how people are already comparing vibe coding to 40k, where "priests" pray to the machines and hope that if they do the exact same thing they'll get the same result they want?

      If we start walking down this road where even the chatbot doesn't understand why what it did was better…

      Serious unintended consequences are going to be inevitable.

      Like, I swear nobody knows the paperclip story anymore.

      Instrumental convergence posits that an intelligent agent with seemingly harmless but unbounded goals can act in surprisingly harmful ways. For example, a sufficiently intelligent program with the sole, unconstrained goal of solving a complex mathematics problem like the Riemann hypothesis could attempt to turn the Earth (and in principle other celestial bodies) into additional computing infrastructure to succeed in its calculations.[2]

      https://en.wikipedia.org/wiki/Instrumental_convergence

      I mean, we can make a very solid argument that many of our current problems are caused by high-level stock trading being done by algorithms whose only instruction is "make the numbers go up".

      This shit ain't even hypothetical anymore; it's just that instead of "make as many paperclips" we told it "make more money than you did yesterday".

      Which is why we're burning down the planet to make billionaires even more money.

  • MountingSuspicion@reddthat.com · 9 points · 7 hours ago

    I may be hallucinating now, but I swear I remember nearly a decade ago there was a paper or some articles about how computer-generated PCBs were using nonstandard electrical tricks to minimize space or something. The designs purposely had arcs or short circuits or something; maybe it was a temperature thing? I did more than a cursory search and couldn't find much, but I vividly remember having conversations about it. Anyone remember anything like that?

    • GreyEyedGhost@piefed.ca · 1 point · 14 minutes ago

      There was a story about a researcher using evolutionary algorithms to build more efficient systems on FPGAs. One of the weird shortcuts involved a design that normally needed a clock circuit; none was available, so the algorithm made a dead-end circuit that would give an electric pulse when used, a makeshift clock. The big problem was that the better efficiency often relied on quirks of the specific board, and his next step was to test the results on multiple FPGAs and use the overall fitness to get past that quirk/shortcut.

      Pretty sure this was before 2010. Found a possible link from 2001.
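      The fix described above (averaging fitness over several boards so evolution can't exploit one device's quirks) can be sketched as a toy genetic algorithm; all names and the quirk model here are made up for illustration, not taken from the original FPGA experiment.

```python
import random

# Toy evolutionary loop. A "circuit" is a bit string scored against TARGET;
# each simulated "board" has its own quirk mask that makes some bits score
# unreliably, standing in for device-specific analog behavior. Averaging
# fitness over all boards rewards designs that work everywhere instead of
# ones exploiting a single device.

random.seed(42)
GENOME_LEN, POP, GENERATIONS = 16, 20, 30
TARGET = [1] * GENOME_LEN  # stand-in for "correct behavior"
BOARDS = [[random.random() < 0.1 for _ in range(GENOME_LEN)]
          for _ in range(3)]  # per-board quirk masks

def fitness_on_board(genome, quirks):
    # A quirky bit only earns half credit on this board.
    return sum(1.0 if g == t and not q else (0.5 if g == t else 0.0)
               for g, t, q in zip(genome, TARGET, quirks))

def mean_fitness(genome):
    return sum(fitness_on_board(genome, b) for b in BOARDS) / len(BOARDS)

pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP)]
for _ in range(GENERATIONS):
    pop.sort(key=mean_fitness, reverse=True)     # rank by cross-board fitness
    survivors = pop[:POP // 2]                   # truncation selection
    children = [[bit ^ (random.random() < 0.05)  # 5% per-bit mutation
                 for bit in random.choice(survivors)]
                for _ in range(POP - len(survivors))]
    pop = survivors + children

best = max(pop, key=mean_fitness)
```

      Scoring each candidate on a single board would let evolution latch onto that board's quirk mask; the cross-board average is what pushes it toward designs that generalize.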

      • MountingSuspicion@reddthat.com · 6 points · 6 hours ago

        Yeah. I didn't call it AI because I'm not sure of the exact method of generation. It may have been AI, or maybe some other generation method.

  • unmagical@lemmy.ml · 13 points · 7 hours ago

    I can't wait for hardware companies to let go of their designers prematurely in the pursuit of AI everything, only for there to be a bug in a major board and no one available to troubleshoot it, stranding customers with a broken board, no revision on the horizon, and no recourse.

  • _cryptagion [he/him]@anarchist.nexus · 12 points · 7 hours ago

    That middle step — the layout — creates a persistent bottleneck. For a board of moderate complexity, the process typically consumes four to eight weeks. For sophisticated systems like computers or automotive electronics, timelines stretch to three months or longer.

    Imagine being the poor soul who connects circuits together in some CAD program for eight weeks straight. I figure I would have pulled all my hair out by the end of the first week.