According to John Hart SM ’02, PhD ’06, who heads MIT’s Department of Mechanical Engineering, the processes of making everyday objects are being dramatically enhanced by the introduction of digital technologies—and the possibilities for the future are boundless.
MIT, Hart says, has an outsized role to play in defining the next generation of manufacturing, leading the way through research, education, and entrepreneurship. A new effort, Manufacturing@MIT, which Hart codirects with Suzanne Berger, Institute Professor of Political Science, will develop new scalable and sustainable production processes and study how new technologies are adopted and how organizations can improve their factories while creating good jobs.
In the lab, Hart’s research melds centuries-old manufacturing know-how with more recent technologies like 3-D printing and robotics, as well as cutting-edge computing techniques, including AI. Take the jet engine, a machine that must operate safely and efficiently at high temperatures. What should its critical components be made from? In many cases, traditional materials have reached their performance limits, and those limits now constrain the sophisticated technologies built from them.
“We need to figure out how to build components that combine the best of multiple materials in one,” says Hart. With today’s 3-D metal printers, “it’s only possible to print one alloy at a time.” Hart, however, developed a new kind of 3-D printer that varies the composition of the material voxel by voxel, throughout the printed object. (A voxel is like a pixel but in three dimensions instead of two.)
First, an inkjet head lays down a set of special inks, each with a different mix of metal or ceramic nanoparticles. A fine metallic powder is deposited on top, and a laser melts the powder, mixing it with the inks. This process repeats, layer by layer, until the final object is formed.
For a jet engine component, this process will allow Hart to print an outer layer that’s resistant to oxidation, an interior that can handle high stress, and a gradient between them. The technique blends the materials together, like a bone that smoothly transitions between its harder and softer portions. In fact, Hart says this technology could be used to create better medical implants like artificial hips or vertebrae that integrate more favorably within the body. The vision for the next wave of 3-D printing, he says, “is the ability to specify the material properties anywhere within an object. And we need to develop the design and computation tools to usher in a new era of product innovation.”
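One way to picture the idea is as a planning problem: each voxel is assigned an ink blend determined by its position in the part. The sketch below is a loose illustration of that concept only, not Hart’s actual printer software; the two alloys, the geometry fields, and the linear blend rule are all assumptions made for the example.

```python
# Illustrative sketch of voxel-by-voxel composition planning for
# multi-material 3-D printing. Not the actual control software:
# the alloys, geometry fields, and linear blend rule are assumptions.

def composition_at(depth_from_surface: float, wall_thickness: float) -> dict:
    """Blend fraction of two hypothetical inks as a function of depth.

    Near the surface the voxel is mostly the oxidation-resistant alloy;
    toward the core it grades smoothly into the high-strength alloy.
    """
    t = min(depth_from_surface / wall_thickness, 1.0)  # 0 at surface, 1 at core
    return {"oxidation_resistant_ink": 1.0 - t, "high_strength_ink": t}

def plan_layer(layer_voxels):
    """For one layer: jet inks per voxel, spread powder, then laser-fuse.

    Mirrors the sequence described above: nanoparticle inks, a bed of
    fine metallic powder, and a laser pass that melts and mixes them.
    """
    steps = []
    for voxel in layer_voxels:
        mix = composition_at(voxel["depth_from_surface"], voxel["wall_thickness"])
        steps.append(("jet_inks", voxel["xy"], mix))
    steps.append(("spread_powder",))
    steps.append(("laser_fuse",))  # melts the powder and mixes the inks in place
    return steps
```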
AI to navigate a world of design choices
“Can AI come up with a product we’ve never thought about?” asks Faez Ahmed, the American Bureau of Shipping Career Development Professor of mechanical engineering, who leads MIT’s Design Computation and Digital Engineering lab. “Can it get a patent?”
Ahmed is trying to create generative AI algorithms that can do just that. Consider the bicycle. When it comes to manufacturing, he says, “most people just take the default bike and make small modifications.” How does a designer navigate such a vast design space to make something genuinely new?
Ahmed’s answer is an “AI copilot.” He likens it to J.A.R.V.I.S., the AI assistant in the Iron Man movies—a technology that could help brainstorm ideas, field questions about the production process, and show how to improve on everything that’s come before.
It’s a system that simulates real-world physics, incorporates manufacturing constraints, and draws on thousands of actual products to respond to requests such as: “I want a futuristic cyberpunk-style road racer.” The results are ready-to-be-built designs that no one has seen before.
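In spirit, such a copilot can be seen as a generate-and-filter loop: propose many candidate designs, discard those that can’t be manufactured, and rank the rest by simulated physics. The sketch below illustrates only that loop; the generative model, constraint check, and scoring function are simple placeholders, not Ahmed’s actual system.

```python
# Illustrative generate-and-filter loop for an AI design copilot.
# The "model," constraint check, and physics score are placeholders
# standing in for far more sophisticated learned components.
import random

def generate_candidate(prompt: str) -> dict:
    """Placeholder for a generative model conditioned on a text prompt."""
    return {
        "prompt": prompt,
        "seat_angle_deg": random.uniform(60, 80),
        "wheel_diameter_mm": random.uniform(550, 700),
    }

def manufacturable(design: dict) -> bool:
    """Placeholder constraint: wheels must fit commonly available rims."""
    return 559 <= design["wheel_diameter_mm"] <= 622

def physics_score(design: dict) -> float:
    """Placeholder for a physics simulation (stiffness, drag, handling)."""
    return -abs(design["seat_angle_deg"] - 73)  # prefer ~73-degree seat angle

def copilot(prompt: str, n: int = 200) -> dict:
    """Sample candidates, keep the buildable ones, return the best scorer."""
    candidates = [generate_candidate(prompt) for _ in range(n)]
    feasible = [d for d in candidates if manufacturable(d)]
    return max(feasible, key=physics_score)

print(copilot("a futuristic cyberpunk-style road racer"))
```

A real system would replace each placeholder with learned models and physical simulation, but the overall propose-check-rank structure is a common pattern in generative design.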
Ahmed is excited by the potential of tools like this one to democratize design. “There are people across the world with good ideas,” he says, “but they’re not necessarily trained in engineering software.” His hope is that his tools may one day lower the barrier to entry, helping people of all backgrounds get started. The goal, he says, is “to augment human potential, not replace it.”
More agile robots on the factory floor
Julie Shah ’04, SM ’06, PhD ’11, the H. N. Slater Professor in Aeronautics and Astronautics, adopts a similar philosophy in her work on robotics.
The traditional difference between people and robots on a factory floor is their flexibility, according to Shah, who leads the Interactive Robotics Group in the Computer Science and Artificial Intelligence Laboratory (CSAIL). Coordination among human workers happens organically, with tasks reassigned as needed. But introduce a robot, and that portion of the manufacturing process tends to become more rigid. Shah also serves as faculty director of the MIT Industrial Performance Center and co-leader of its Work of the Future Initiative, which carries out an applied research and educational program to understand how organizations make new technologies work in practice.
“If you want to know if a robot can do a task,” she says, “you have to ask yourself if you can do it with oven mitts on.” But much of the work we do, in manufacturing and elsewhere, requires dexterity and adaptability, and getting a robot to help with nimbler tasks is costly. According to Shah, that’s one reason only 1 in 10 manufacturers in the United States has a robot, and why those that do tend not to use them much.
Shah is trying to change that by designing robots and automation systems that are safe, smart, and flexible. Done properly, she says, human-robot integration in some cases has led to better-paying jobs, workers learning new skills, and higher profits and productivity.
Part of her work involves programming robots to model how people move and operate so they can better integrate into an industrial environment alongside their human companions. Crucially, she’s also designing robots that workers on the factory floor can program and test without needing special expertise. That means affordable, more agile machines (think no oven mitts) that are just as easy to teach and train as a human being—“a system that you can program with low-code or no-code interfaces,” she says.
Shah conjures an example of someone wearing a mixed-reality headset while they walk through an industrial process, narrating the steps. That visual and verbal information would be automatically compiled into a control program, “and then the robot just does it,” she says. The goal is to put frontline workers in the driver’s seat for testing, deploying, and reprogramming robots as the work changes. “That’s the dream of the future.”
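As a rough sketch of that walk-and-talk idea, the snippet below compiles narrated steps into commands for a robot’s skill library. The step grammar and the robot interface are invented for illustration; a real system would involve perception, speech understanding, and motion planning.

```python
# Illustrative no-code programming sketch: narrated steps from a
# mixed-reality walkthrough are compiled into a robot task program.
# The step grammar and the robot "skills" are invented for illustration.

NARRATED_STEPS = [
    "pick gasket from bin A",
    "place gasket on fixture 3",
    "press for 2 seconds",
]

def compile_step(utterance: str) -> tuple:
    """Map a narrated instruction to a (verb, arguments) command."""
    verb, _, rest = utterance.partition(" ")
    return (verb, rest)

def run(program, skills):
    """Dispatch each compiled command to the robot's skill library."""
    for verb, args in program:
        skills[verb](args)

# A stand-in robot: a dict of skills that just report what they would do.
skills = {v: (lambda args, v=v: print(f"robot: {v} -> {args}"))
          for v in ("pick", "place", "press")}

run([compile_step(s) for s in NARRATED_STEPS], skills)
```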