Frontier Materials + AI + Future + Cities

How can we use mixed reality, architectural design, and Frontier Materials to build better cities? Watch this discussion on using AI to connect the digital and physical worlds.
February 11, 2022

Two experts have a focused discussion about how future cities will look increasingly different from what society is used to seeing or experiencing. With the introduction of sophisticated artificial intelligence and 3D design tools, future cities are being planned with sustainability, mobility, and the human experience at the core of the design, giving many new cities innovative features that seek to improve the way they operate. Technology and science have enabled engineers and urban designers to approach future settlements with a keen eye for sustainability, smartness, and the use or reuse of frontier materials.

Click on the toggle above for the full transcript.

About PUZZLE X™:

PUZZLE X 2021 | Nov 16-18 is the world's first collision grounds for science, business, venture and societal impact. It brings Frontier Materials to the forefront to help achieve the Sustainable Development Goals set out by the United Nations for 2030.


View PUZZLE X 2021 program here.

Want to be a part of PUZZLE X? Register your interest here.

Design & Art

AI & MATTERverse

Smart Cities

Future

Construction & Infrastructure

Machine Learning

What SDG is this related to?

MATTERverse Activity

Mardis Bagley  0:01  

Can you hear me? Excellent. Life is so much better with one of these. Well, cool. Wow, what a day. So far we've seen 3D printed meat, right, not really meat actually plant based meat. We've seen a Nobel Prize winner for graphene, and some other amazing things. So thank you for sticking around for Greg and I, my name is Mardis Bagley. I am a creative director and founding partner of NoNfiction. NoNfiction is a design studio in San Francisco, California. And we like to say that while we design hardware products, it's really more about turning science fiction into reality for a better future. And this is what we do. We're in San Francisco, kind of that hub of innovation, one of many in the world. And we try to harness that power and use it to make an impact on the world. So that's me, I'll pass it over to Greg, for an intro.


Greg Demchak  1:08  

Yeah, thanks. My name is Greg Demchak. I'm with Bentley Systems; I run an innovation lab for a global software company, where I currently look at everything from mixed reality to digital twins, to how architects, engineers, and builders use digital tools to build better cities and have better experiences with the built environment. I'm an architect by background, but I've been developing software solutions for that space for about 20 years now. So together we're going to go on a little bit of a journey. We're coming from sort of outside the space of material science directly, and trying to give us a look at the future of materials from the perspective of a creative technologist and a software designer.


Mardis Bagley  1:56  

Thank you, Greg. Well, again, perfect intro: we are outside of the world of materials, but we interact with them all the time. So NoNfiction, as I said, we turn science fiction into reality for a better future. I've been doing industrial design along with my studio for many years; we've put a lot of products out into the ecosystem, we've sold a lot of amazing products. But that made me and my company step back and look at what we're doing: we're putting lots of things into landfills. So early on, we decided to do something different, and we started to align our company with the United Nations Sustainable Development Goals. We're a consultancy, and every client we work with must satisfy at least one of those Sustainable Development Goals: eliminating poverty, bringing education, fighting climate change, and so on. This is really at the core of what we do. And surprisingly, there aren't a whole lot of studios that really focus on that. So at our studio, we do industrial design, design engineering, experience design, strategy, many more things, but also space architecture. We've designed things that go up into space; my partner is also an outer-space architect trained in Houston, Texas, in one of the only programs that is actually NASA-sponsored, so we try to live in that field a lot.


Mardis Bagley  3:27

But today, we're going to talk about science, technology, art, and design. And the real reason is because there are a bunch of brilliant scientists, material scientists, innovators out there doing things that we can only begin to imagine. But to get that from point A to point B, or even point Z, that's a real challenge. You do need design; you do need, well, I'll go ahead and say it here: science, technology, art, and design. Science validates magic, really interesting, right? Technology creates those possibilities. Art triggers emotion. And design enables connection. And so this is really the true synthesis of how we get products to market that are very sticky, very adaptable, right, when you're asking for behavior change from people. So, like, we have to save the world, you know, but everybody has something else to do, everybody has something else they're more comfortable doing. So if you put these four together, you have magic, possibilities, emotion, and connection that people are willing to buy into, or that they're willing to fall in love with, and actually move on with new products and new behaviors that, hopefully, at least in our mind, create impact for future generations. We're also hosts of the West Coast's largest material library, sponsored by Material ConneXion. And part of why we do that is we want materials in-house: in our studio, we go and we touch the materials, we interact with the materials, we invite our clients and our vendors to come engage with them, because it's very important to be very tactile. You can read all the blogs and watch all the YouTube videos that you want about a material, but until you're actually playing with it, it's not quite real. So this is part of what we do; we bring it to the forefront of our process.


Mardis Bagley  5:26

So, over the years, we've designed a lot of products for neuroscientists. Up here in the center, this is Halo Neuroscience. If I asked the average individual where their motor cortex is, they may not know, right? But when you put on a headset, headphones, that has a stimulator applying electricity to your brain so you can move faster, so you can learn skills faster as an athlete, this is suddenly a pivotal technology built into a use case that's extremely familiar. So Halo Neuroscience is making differences that Olympic athletes are starting to see, that professional leagues of football and baseball are starting to see. Over here on the far right-hand side is a Philips SmartSleep, another product to help the brain find its best sleep. It stimulates, and gathers metrics about it. But what's really interesting about that is it's not a hardware product, it's not an injection-molded product; it's a soft product, what we call soft goods. And designing for the head and the ergonomics around it is extremely difficult. So how do we do that? We have to measure heads; we have to create a system that fits everything from the 5th to the 95th percentile of head sizes, males and females.
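The 5th-to-95th-percentile fit target mentioned above is a standard anthropometric design practice. A minimal sketch of computing such a range from measurement data; the head-circumference numbers here are synthetic, purely for illustration:

```python
import statistics

def fit_range(measurements, lo=5, hi=95):
    """Return the (5th, 95th) percentile bounds of a measurement sample.

    A product sized to span this range covers roughly 90% of the
    population the sample represents.
    """
    data = sorted(measurements)
    # statistics.quantiles with n=100 yields the 1st..99th percentiles
    pct = statistics.quantiles(data, n=100, method="inclusive")
    return pct[lo - 1], pct[hi - 1]

# Illustrative head circumferences in cm (synthetic, not real survey data)
heads = [52.0, 53.5, 54.0, 54.5, 55.0, 55.5, 56.0, 56.5, 57.0, 58.0, 59.5, 61.0]
small, large = fit_range(heads)
```

A real sizing study would use a large anthropometric survey rather than a dozen samples, but the percentile arithmetic is the same.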


Mardis Bagley 7:00

When we talk about nanomaterials, what's the next step of that? Maybe we can have a smart material, so that instead of designing one headset, or 50 headsets, that fits everybody, it evolves its shape and size. You know, we have enough trouble trying to find a pillow or a mattress that's comfortable; what if your headset, or even your pillow or mattress, could shape to your position? These are the things that we're looking at right now. Over here in the lower right, we've developed a product for Cala Health. It's the first-ever FDA-approved product released in the market that helps people who suffer from essential tremor. Now, this is really important because you don't have to wear it for a year, or six months, or three weeks; you can put it on, and 20 minutes later you go from not being able to write, not being able to text on a phone, to a grandmother writing a letter to her grandkids or texting them. It's really transformational. And then finally, over here, one of the products we've done is more of a first-responder story. It marries augmented reality into the helmet of a firefighter, and then it uses thermal mapping to do edge detection in the room around you. So suddenly a firefighter can walk into a zero-visibility fire, smoke everywhere, and see where the steps are, where the door is, where somebody might be lying; go to rescue them, and leave the burning building alive. This is shaving seconds off of their ingress and egress to a burning building. All the while, the data is sent up to the cloud and back to the captain at the fire truck, where he's captured the 3D data and shared it out to everyone else on his team, so they all know what the inside of the burning building looks like and they can be safer along the way. These are the fun things that we work with.
This is very advanced technology, to be honest, but it's only the tip of the things we're talking about today. Our engineers and our engineering partners have, over the years, worked for Apple designing the thinnest iMac; we've reimagined what it could be like if Apple had that reemergence into the marketplace. We've worked with fully invasive NeuroPace stimulators for epilepsy. Not an easy thing to do, but definitely very rewarding in the end.
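The thermal edge detection described for the firefighter helmet can be sketched with a classic Sobel gradient filter. This is an illustrative toy, not the actual product's algorithm; the "thermal frame" below is synthetic:

```python
def sobel_edges(img, thresh=1.0):
    """Edge map of a 2D 'thermal' image via Sobel gradients.

    img: list of lists of floats (e.g. temperatures in C).
    Returns a same-size grid of 0/1 flags; 1 marks a strong
    temperature gradient, such as a doorway or wall edge.
    """
    h, w = len(img), len(img[0])
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(ky[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            if (gx * gx + gy * gy) ** 0.5 > thresh:
                out[y][x] = 1
    return out

# Synthetic 5x6 thermal frame: a cool region (20 C) next to a warm one (30 C)
frame = [[20, 20, 20, 30, 30, 30] for _ in range(5)]
edges = sobel_edges(frame, thresh=5.0)
```

Running this marks the column where the temperature jumps, which is the kind of boundary (a hot doorframe, a wall) an AR overlay could draw for a firefighter in zero visibility.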


Mardis Bagley  9:43

And then finally, nowadays, we're taking all that knowledge, we've done a lot of medical products, a lot of consumer electronics, and we're starting to build them into larger systems. Up here on the upper left is work with Nashville, Tennessee: what does it look like to turn Nashville, already a beautiful place, into a smart city where we want to integrate not only electric vehicles but smart vehicles? Built-in AI that understands the patterns of not only traffic but also people. When an event happens, the patterns of how transportation moves change; suddenly the bus stops change. Because what we've realized over time is that so much of our technology is value-engineered for the masses, value-engineered to the lowest common denominator. And then what happens? People don't take transportation like they should. We want to elevate that; we want to bring pleasure back into this whole idea of public transportation and smart cities.


Mardis Bagley 10:48

Down here, the other thing we're working on is a school system for the country of Singapore. And this is really interesting, because Singapore already has a pretty fantastic school system. Why would they reach out to a bunch of designers to do it better? Well, because we're not educators. I've taught at the professor level off and on throughout my career, but I'm ultimately a designer, and they want to break outside that proverbial box. Did you know that in most schools, the physical architecture, those square boxes, is actually designed by the same architects that design prisons? And that's probably not a surprise if you've spent some time in school. Well, we're changing that up: we're creating schools that are actually designed with biomimicry, or even biomorphism; we're bringing back organic-looking structures. Not only are people more comfortable, but bringing in green and soft elements lowers heart rates and improves retention of knowledge. It seems kind of funny when I'm saying this, it's like, yeah, of course, people love the outdoors, they're calm, but it's not how we've been doing it for years. So we're changing that up, and we use parametrics to do that, right? So we can take our model and drop it in multiple different places, changing up the architecture. And then finally, on the right: why do buildings have to be static? Why can't they breathe? Why can't they respire? Can they change with the temperature? Can they change with how many people are in the room? So what we started doing with our project in Singapore, this school, Lifelong Learning is what we call it, is we started attaching wearables to kids. We're gathering biometric data, galvanic skin response, heart rate variability, and we're using computer vision to detect moods. And these kinds of things can be pumped back into the physicality of the building, to actually open windows and change temperatures on the fly.
So finally, the last thing: as I said, all these things come together into these larger systems. We work a lot in space, trying to think about how, if we're going to use the Moon as a launch point for Mars and beyond, what does that look like? Do we want to go up and sit on a gray ball in space? Not fun. So hey, let's integrate some colors. Let's integrate plants; we can have plants that grow in microgravity, we can start to do these things. But again, it's gathering those micro data points to make sure that the architecture is almost living alongside everyone.
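The sensor-to-building feedback loop described above, wearable biometrics driving windows and temperature, can be sketched as a simple rule-based controller. Every name, threshold, and action here is invented for illustration; a real adaptive-building system would involve far more sophisticated sensor fusion, privacy handling, and safety logic:

```python
from dataclasses import dataclass

@dataclass
class RoomState:
    temp_c: float           # room temperature
    occupancy: int          # people detected by computer vision
    mean_heart_rate: float  # aggregated wearable data (bpm)

def building_response(state, comfort_temp=24.0):
    """Map aggregated room/biometric readings to simple actuations.

    Purely illustrative: the thresholds are made up, and the actions
    are just strings a building-management system might interpret.
    """
    actions = []
    # Crowded or warm rooms: ventilate.
    if state.temp_c > comfort_temp or state.occupancy > 25:
        actions.append("open_windows")
    # Elevated average heart rate suggests stress: soften the environment.
    if state.mean_heart_rate > 95:
        actions.append("dim_lights")
    return actions

print(building_response(RoomState(temp_c=27.0, occupancy=30, mean_heart_rate=100)))
```

The point of the sketch is the shape of the loop: sense, aggregate, decide, actuate, on the fly, exactly as described for the Singapore school.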


Mardis Bagley  13:47

We're happy to say that we just won an award from NASA and the Canadian Space Agency for redesigning what it looks like to create food over long-term space travel. So these are all the things that we're up to. Why do I mention all this? Because all of that is very high technology, right? There's a lot of really innovative work there. I'm a designer; do I understand it all? I speak with all the architects, I speak with all the scientists, and I try to figure this out. And quickly, my team and I ingest this information and regurgitate it in a brainstorm of ideas, and we push the boundaries. We know the material designers are fantastic at figuring out solutions, but they tend to go down a single pathway. We go really wide. We diverge in our ideas, and then we converge, and we do this over and over again. And what we've found is that collaborating with a fully balanced team of designers, engineers, and artists is the way we are most successful. Okay, I'll pass that on to Greg.


Greg Demchak  14:58  

And then we'll get into a conversation here. So, as a designer and architect by background, I went through this process of actually having to innovate the field of architecture: going from 2D representation as a way to build a building to a full 3D model, where the 3D model is what actually generates the floor plan. I started that journey back in 1998, 2000. Today, that product, which is called Revit, is used by architects all over the world, but at the time it was just an idea at the very early stage. And I've sort of progressed in that career building software solutions, trying to imagine a future state, something that is not available today, and imagine how it could scale up. So what I'm talking about is some of the more recent work that's right at the edge: what could that look like if we start now, and what could it be like 20 years from now? Can we have the same kind of innovation that happened with going from 2D to 3D? So the company I work for, Bentley Systems, again, it's software development, it's 3D modeling, it's CAD systems, it's the ability to interact with 3D systems to represent the built world. That's sort of our playground: how do we interact with the digital world? And now we're looking at this whole idea of the digital twin: the digital twin of our physical environment, the digital twin of our body, the digital twin of anything that we can interact with.


Greg Demchak  16:23

The truth is, now reality can be captured in near real time. So no longer are you making a 3D model and then handing it over; it can actually be rendered in real time. And there's this entire new possibility of systems coming together to create these digital twins, and it involves absolutely everything: humans, machines, IoT, information systems. You yourself will become part of it.
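A digital twin, at its simplest, is a live software mirror of a physical asset that updates as sensor readings stream in, so any viewer (a dashboard, a mixed-reality headset) sees near-real-time state. A toy sketch, with the asset and sensor names invented for illustration:

```python
import time
from dataclasses import dataclass, field

@dataclass
class AssetTwin:
    """Minimal digital twin: a live mirror of one physical asset.

    `state` holds the latest reading per sensor; `history` keeps the
    full stream, so past conditions can be replayed or analyzed.
    """
    asset_id: str
    state: dict = field(default_factory=dict)
    history: list = field(default_factory=list)

    def ingest(self, sensor, value, ts=None):
        ts = ts if ts is not None else time.time()
        self.state[sensor] = value                 # current snapshot
        self.history.append((ts, sensor, value))   # audit trail

twin = AssetTwin("bridge-42")
twin.ingest("strain_gauge_3", 0.0021, ts=0)
twin.ingest("strain_gauge_3", 0.0024, ts=1)
print(twin.state["strain_gauge_3"])  # latest reading wins
```

Production digital-twin platforms add geometry, streaming ingestion, and simulation on top, but this snapshot-plus-history core is the common pattern.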


Greg Demchak  17:08

So I don't know if anyone has seen this film, Big Hero 6. Has anyone in this materials lab seen this film? Fewer than I thought, so I'm not breaking the rules here. But I think it's also interesting to look at cinema and the imagination of filmmakers. So this is a film that came from Disney. And this kid gets up here, and he's imagining: if nanobots were given agency, and if you could control them with your mind, what could you possibly do? So here, he's got this little robot that he's actually manipulating with mind control, and it will create basically any form that he can imagine, in real time. Now, I don't know if this is going to be possible now, or in the next 20, 30, 40 years, but the vision of this, I think, is super interesting. And the reason I want to show this, I'll show you on the next slide: there's research happening right now that is already starting to suggest what this type of material could mean. We were talking before the session: if you have these types of nanomaterials that could just self-form through thought, these chairs could actually have emerged on the stage for us, like, right now in real time, and then just sort of disintegrated into a regenerative material cycle.


Mardis Bagley  18:27  

Well, we also know that this ability to move things with the mind is possible, right? We've seen it tested and things like that. But right now, there's a company called Atom Limbs, that is actually taking the brain as well as nerve endpoints to help people move prosthetics with their mind. And this is really far advanced compared to the claws we've seen previously.


Greg Demchak  18:54  

And if we could roll this video: this is actual research being done right now. I think it's interesting, I tried to grab examples that are obscure; this has only 145 views, by the way. This is research being done at MIT. She's actually got a mind-control device on here, and the woman on the screen is actually controlling a quadcopter drone just with thought. So if this is happening today, where someone can think about moving a drone through 3D space, can we imagine that you could also have the same thought and actually start to construct three-dimensional geometry? Have things emerge in front of you?


Mardis Bagley  19:30

Yeah, almost, if you had multiple drones, like, say, hundreds of them, at micro scale, at nanoscale.

Greg Demchak  19:46

Or at this kind of scale. These are actual examples from 2018, so they're three years old, but these are examples of materials that have almost their own form of agency. They have micro-movement, micro-kinetics. So if we can imagine that these materials can have that kind of agility and motion, who's to say we can't get to the point where we conjure something, where we imagine something and it gets created, or AI, computer-generated systems, could actually produce architecture, produce form, produce solutions that we ourselves can't even begin to imagine as humans. Very interesting.


Mardis Bagley  20:24  

I love the idea of doing it in sort of a generative model, right, allowing AI to build these for us. Suddenly we have the art of the most efficient, and the art of the most efficient, strangely, tends to look like organic structures, because, guess what, planet Earth and the organisms on it have been doing this for millions of years. But it's really kind of a beautiful way to let networks and mesh robots come together and develop things.
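One classic way to grow organic-looking structures from simple rules, in the spirit of the generative approach described above, is an L-system, a rewriting grammar long used to model plant growth. Neither speaker mentions L-systems specifically; this is just an illustrative example of rule-based generative form:

```python
def l_system(axiom, rules, steps):
    """Rewrite `axiom` `steps` times using the production `rules`.

    L-systems are a classic generative grammar for plant-like,
    branching forms; each rewrite adds another level of detail.
    """
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Lindenmayer's original algae model: A -> AB, B -> A
rules = {"A": "AB", "B": "A"}
print(l_system("A", rules, 5))
```

The string lengths grow along the Fibonacci sequence (1, 2, 3, 5, 8, 13, ...), a small example of how very simple local rules yield the kind of organic complexity the speakers are pointing at.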


Greg Demchak  20:56  

And this is another example. This is from a colleague of mine who now works at New Balance on new materials for shoes. But he's also exploring, he's an architect by background too, the use of GANs, generative adversarial networks. It's a very interesting form of machine learning that takes an input of previous work and can basically generate new versions of that artwork, just through its own training algorithms. So in this example, this is an artist named Lebbeus Woods. He imagined all these possible architectural futures, but he never built anything; in fact, what he made were a lot of 3D models and sketches. But now, when you apply this type of machine learning to it, you can actually generate sketches that Lebbeus Woods would probably consider his own artwork. So if you have that as an input, and our cities can be produced through these micro-nanobots that can organically self-organize, what does that mean for the future?
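A GAN pits two models against each other: a discriminator scores how "real" a sample looks, and a generator tries to fool it. A minimal sketch of the two adversarial objectives on toy 1-D data (this is not the colleague's Lebbeus Woods work, and not training code, just the loss functions the two players optimize in alternation):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gan_losses(d, g, real, noise):
    """Evaluate both adversarial objectives for one batch.

    d: discriminator, maps a sample to P(sample is real).
    g: generator, maps noise to a fake sample.
    Training minimizes loss_d w.r.t. d and loss_g w.r.t. g, in
    alternation -- that alternation is the 'adversarial' part.
    """
    fakes = [g(z) for z in noise]
    loss_d = (-sum(math.log(d(r)) for r in real) / len(real)
              - sum(math.log(1.0 - d(f)) for f in fakes) / len(fakes))
    loss_g = -sum(math.log(d(f)) for f in fakes) / len(fakes)
    return loss_d, loss_g

# Toy 1-D setup: real data clusters near 5; an untrained generator near 0.
d = lambda x: sigmoid(x - 2.5)   # believes large values are real
g = lambda z: z * 0.1            # produces values near 0
loss_d, loss_g = gan_losses(d, g, real=[5.0, 5.2, 4.8], noise=[0.0, 0.5, 1.0])
```

Here the untrained generator's loss is much larger than the discriminator's, which is the starting condition training then fights over; in a real image GAN, d and g are deep networks rather than one-line lambdas.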


Mardis Bagley  21:52  

I like this idea that you can actually start to do it, and again, biometrics are really, really important. We can start to gather data about ourselves, our habits, our moods, and suddenly maybe I'm not just designing from what I see. Because when we design a house, let's say you're building your first house, you're going to take examples from existing houses. You're going to say, oh, I love that house down the block, I love Mies van der Rohe, the Bauhaus is great, and you're going to come up with this mashup. Maybe it's based not upon what you see in your external world; maybe it's based on your emotions and your attitude and your temperature and all these different sorts of biometric feedback.


Greg Demchak  22:34  

That connects to another thing I get to play with, by the way, in my day job. And this is, again, back to thinking about what is now and what could 20 years look like. This is the Microsoft HoloLens. This is version two: a head-mounted, wireless computing device that projects holograms into your eye. And I can tell you this: I am actually interacting with three-dimensional holograms on a daily basis, because of these sensor networks, which are picking up my body motion, my head gaze, and eventually, you know, all biorhythms, everything else. One thought we had here: imagine in 20 years, when this is literally just a pair of glasses on your head, or a pair of contact lenses. If this is version two now, and it's functional today, and you project this form factor 20 years out, you can imagine. One thing that I've realized working with this, and it goes back to something I think Steve Jobs showed when he first launched the iPhone, back to a material statement too: he made the point that when he released the iPhone, he removed all the physical buttons from the phone, right, the screen became the interface. One thing that becomes interesting: we have an entire world surrounded by sensors, in fact sensors that are reacting to your body, so you would no longer need a mouse or a keyboard or a screen, because this is generating the screen for you in your environment, and any single surface can become that screen. In fact, the whole idea of mechanical buttons becomes essentially obsolete.


Mardis Bagley  24:13  

Well, mechanical buttons never existed before the Industrial Revolution anyway. Find a tree with a button, I dare you; it doesn't exist, right? They don't exist in nature. We shouldn't have them in our products, and eventually we're going to get rid of them. The Human Headphones that were earlier in my presentation, we removed all the buttons and it's just gesture controls: swipe forward or back, right. And I love what you're talking about: when you have the HoloLens on, you no longer need a mouse. You might instinctively still have, you know, this gesture, because that's what we've been used to for generations now. But, you know, maybe you get a block of wood; maybe you think it's a fine sculpted piece. Maybe it's a chrome sculpture or something, but it doesn't have batteries, it's not injection-molded, it doesn't have all those toxic things we put in landfills. Suddenly, it's the best sustainability story we've ever crafted. So VR, AR, these augmented realities, holograms, become a beautiful story about saving our planet. So let that sink in for a bit.


Greg Demchak  25:22  

Yeah, and I'll end with this sort of video; this is a recent video I shot last week. This sort of drops the metaverse theme into it. But just to mention that it is real: I actually ran this experience last week with a colleague. We were in different locations, working together with digital versions of ourselves, having a collaborative design session. And in terms of materials, what we've completely eliminated in our interaction is any need for a physical keyboard, or a mouse, or a digital screen. Everything that we're seeing is being projected as light directly into our eyes, and we're communicating in real time with one another. So I think this totally changes the nature of interaction. And if we get into, like, graphene-type gloves, the next level would be that you could actually have force-feedback interactions with your digital world, so that when you pick up that hologram, you have a sense that you can feel it, you can touch it, interact with it.


Mardis Bagley  26:20  

Which brings up the whole idea of telepresence, right, and digital twinning. So you could be connected with somebody that's on the other side of the planet; you could be connecting with someone who might be in a disaster situation and you're trying to help them out. You can calmly do whatever you need to do, share information, reach out to disaster situations, and feel the environment around you. The haptic gloves have pressure sensitivity, and you can feel rocks or moving things as you please; maybe you can feel temperature as well. Which makes us think: we're not going to be on Mars for a little while. We're not going to be on too many planets for a little while. But we know we've left our solar system, right, with probes, with satellites. So maybe we can start to pick up data along those journeys in outer space. In deep space, if we can gather enough data and send it back to this virtual environment that we've just created, we can mimic what's happening on any planet that we've gathered the data about. Suddenly we're not just looking through telescopes; it actually feels like we're on that other planet. The temperature, the texture, maybe the wind. How about smell senses, right, smellovision, if you will? I mean, that's kind of a real thing. The best way to immerse somebody into a virtual experience: smell, touch, sound. Sound and smell are some of the best memory tools that you can ever imbue upon someone.


Mardis Bagley  27:58

Think about that: when you hear a song you haven't heard for a long time, and you love that song, you probably remember the last time you heard it. When you smell a cologne, there's a scent somebody wore when I was in high school, I know who it is, and every time I smell that one cologne scent, I know exactly the moment in time that I was there; I remember what happened. And then I won't smell that smell again for 20 years, because they don't make the cologne, it's not as ubiquitous anymore. So I think it's about pushing those boundaries of what VR is. It's not just about the eyes; that's only one sense. We're multi-sensory creatures. Synesthetic experiences, the overlapping of the senses, are really important. I can tell you, with the school that we did for Singapore, we realized, you know, Singapore is a meritocracy-based system, right? You succeed because you got an A+, you get to go on to the next level, you get a good job. But there are a lot of brilliant kids out there that may not do well on tests, but they do fantastic at drawing. They do fantastic sculpting. Maybe they just learn differently. So if we can align these sorts of VR experiences to the neurodiversity of everybody that uses them: Mardis works better with visuals, Greg works better with sculpting, or something like that. You know, 3D is my thing, but maybe words are his. How do we figure that out and apply it to that virtual sort of experience, to heighten everybody's overall experience?


Greg Demchak  29:39  

Yeah. And then carry it into the physical world, too. So how do we get to that big, Big Hero 6 vision, practically? I think it's a powerful vision. I think there are examples of that technology already starting to emerge. Maybe it happens through an event like this, where scientists come together and actually start to solve that problem today, and in 20 years we'll be looking back at it and go, yeah, it was obvious.


Mardis Bagley  30:05  

I love it. I love it. Well, it's only gonna work if we all collaborate, right? You scientists, reach out to your designer friends, designer friends, reach out to your scientist friends, bring everyone into the fold, and create that conversation. Go to those really far out points and make yourself really uncomfortable. Diverge, and then converge on ideas and then do it over and over again. And this is what we found is key to innovating and moving very, very quickly.


Greg Demchak  30:34  

Cool. Thank you, Mardis. I think we're at time. Thank you so much for your time.


Mardis Bagley  30:37

Thank you, everyone.



Author

Related Contacts

Mardis Bagley

Mardis is a Creative Director in industrial design, striving to connect with users through hardware and brand experiences. He creates stories that resonate with users and has extensive experience with collaborative exercises, research and iterative processes that include sketching, 3D modelling and rendering. Mardis has managed contractors, manufacturers and has travelled overseas numerous times to shepherd products through manufacturing to create products that are smart and eco-conscious. In addition, Mardis is a graphic designer and visual artist with skills that include ideation, logo development, information flows, wire framing, visual web and app design, packaging, trade show and other collateral pieces.

Greg Demchak

Greg Demchak is an agent of change who is passionate about the role of technology with respect to how the built environment is designed, constructed, and operated. Educated as an architect at MIT, he has a deep appreciation for design, history, computation, and the evolution of design thinking. His design background has morphed into a focus on the user experience design for computer-based modeling systems, most notably Revit, Synchro, and the HoloLens. Greg has developed skills in agile product management, user experience design, hackathons, and customer-driven innovation prototyping. He is starting a new project within Bentley Systems to lead an innovation lab dedicated to mashing up various technologies in order to discover impact.

Greg Demchak  1:08  

Yeah, thanks. My name is Greg Demchak. I'm with Bentley Systems; I run an innovation lab for a global software company, where I currently look at everything from mixed reality to digital twins, to how cities, architects, engineers, and builders use digital tools to build better cities and have better experiences with the built environment. I'm an architect by background, but I've been developing software solutions for that space for about 20 years now. So we're going to go on a little bit of a journey together. We're coming at this from somewhat outside the space of material science, trying to give you a look at the future of materials from the perspective of a creative technologist and a software designer.


Mardis Bagley  1:56  

Thank you, Greg. Well, again, perfect intro: we are outside of the world of materials, but we interact with them all the time. So NoNfiction, as I said, we turn science fiction into reality for a better future. I've been doing industrial design along with my studio for many years; we've put a lot of products out into the ecosystem, and we've sold a lot of amazing products. But that made me and my company step back and look at what we were doing: we were putting lots of things into landfills. So early on, we decided to do something different, and we started to align our company with the United Nations Sustainable Development Goals. We're a consultancy, and every client we work with must satisfy at least one of those Sustainable Development Goals: eliminating poverty, bringing education, fighting climate change, and so on. This is really at the core of what we do. And surprisingly, there aren't a whole lot of studios that really focus on that. So at our studio we do industrial design, design engineering, experience design, strategy, many more things, but also space architecture. We've designed things that go up into space; my partner is also an outer-space architect, trained in Houston, Texas, in one of the only programs that is actually sponsored by NASA. So we try to live in that field a lot.


Mardis Bagley  3:27

But today we're going to talk about science, technology, art, and design. And the real reason is that there are a bunch of brilliant scientists, material scientists, and innovators out there doing things that we can only begin to imagine. But getting that from point A to point B, or even point Z, is a real challenge. You do need design. I'll go ahead and say it here: science, technology, art, and design. Science validates magic, really interesting, right? Technology creates those possibilities. Art triggers emotion. And design enables connection. This is really the true synthesis of how we get products to market that are very sticky and very adaptable, which matters when you're asking for behavior change from people. We have to save the world, but everybody has something else to do, something else they're more comfortable doing. So if you put these four together, you have magic, possibilities, emotion, and connection that people are willing to buy into, that they're willing to fall in love with, so that they actually move on to new products and new behaviors that, hopefully, at least in our mind, create impact for future generations. We're also hosts of the West Coast's largest material library, sponsored by Material ConneXion. Part of why we do that is that we want materials in-house. In our studio, we go and touch the materials, we interact with them, and we invite our clients and our vendors to come engage with them, because it's very important to be tactile. You can read all the blogs and watch all the YouTube videos you want about a material, but until you're actually playing with it, it's not quite real. So this is part of what we do: we bring it to the forefront of our process.


Mardis Bagley  5:26

So, over the years, we've designed a lot of products for neuroscientists. Up here in the center, this is Halo Neuroscience. If I asked the average individual where their motor cortex is, they may not know, right? But when you put on a headset, headphones with a stimulator applying electricity to your brain so you can move faster and learn skills faster as an athlete, this is suddenly a pivotal technology built into a use case that's extremely familiar. So Halo Neuroscience is making differences that Olympic athletes are starting to see, that professional football and baseball leagues are starting to see. Over here on the far right-hand side is the Philips SmartSleep, another product to help the brain find its best sleep; it stimulates and gathers metrics along the way. But what's really interesting about it is that it's not an injection-molded hardware product; it's a soft product, what we call soft goods. And designing for the head and the ergonomics around it is extremely difficult. So how do we do that? We have to measure heads; we have to create a system that fits everything from 5th- to 95th-percentile head sizes, male and female.


Mardis Bagley 7:00

When we talk about nanomaterials, what's the next step of that? Maybe we can have a smart material so that, instead of designing one headset, or 50 headsets, that fits everybody, it evolves its shape and size. We have enough trouble trying to find a pillow or a mattress that's comfortable; what if your headset, or even your pillow or mattress, could shape itself to your position? These are the things we're looking at right now. Over here in the lower right, we've developed a product for Cala Health, the first-ever FDA-approved product released to the market that helps people who suffer from essential tremor. Now, this is really important, because you don't have to wear it for a year, or six months, or even three weeks; you put it on, and 20 minutes later you go from not being able to write or use a phone to, say, a grandmother writing a letter to her grandkids or texting them. It's really transformational. And then finally, over here, one of the products we've done is more of a first-responder story. It marries augmented reality into the helmet of a firefighter and then uses thermal mapping to do edge detection in the room around you. So suddenly a firefighter can walk into a zero-visibility fire, smoke everywhere, and see where the steps are, where the door is, where somebody might be lying, go rescue them, and leave the burning building alive. This shaves seconds off their ingress and egress. All the while, the data is sent up to the cloud and back to the captain at the fire truck, where he's captured the 3D data and shared it out to everyone else on his team, so they all know what the inside of the burning building looks like and can be safer along the way. These are the fun things that we work with.
This is very advanced technology, to be honest, but it's only the tip of the things we're talking about today. Our engineers and engineering partners have, over the years, worked for Apple designing the thinnest iMac, and we've reimagined what it could be like if Apple had a reemergence into that marketplace. We've worked on fully invasive NeuroPace stimulators for epilepsy. Not an easy thing to do, but definitely very rewarding in the end.
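The edge detection Mardis describes for the firefighter helmet can be illustrated with a classic Sobel gradient filter over a grid of temperatures. This is a minimal sketch, not the actual product's algorithm; the frame values and shapes are made up for the example.

```python
import numpy as np

def sobel_edges(thermal):
    """Return the gradient magnitude of a 2-D thermal frame.

    Boundaries in temperature (door frames, steps, a body on the floor)
    produce large gradients, so they stand out even in zero visibility.
    """
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = thermal.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    # Convolve the interior pixels with the two Sobel kernels.
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = thermal[i - 1:i + 2, j - 1:j + 2]
            gx[i, j] = np.sum(kx * patch)
            gy[i, j] = np.sum(ky * patch)
    return np.hypot(gx, gy)

# A cold room (20 C) with a hot region on the right (a burning doorway, say).
frame = np.full((6, 6), 20.0)
frame[:, 3:] = 300.0
edges = sobel_edges(frame)
# The response is strongest along the temperature discontinuity between
# columns 2 and 3, and zero inside the uniform regions.
print(edges)
```

In a real helmet this would run on every frame from the thermal camera, and only the high-gradient outlines would be drawn into the AR display.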


Mardis Bagley  9:43

And then finally, nowadays, we're taking all that knowledge, we've done a lot of medical products and a lot of consumer electronics, and we're starting to build it into larger systems. Up here on the upper left, we're working with Nashville, Tennessee: what does it look like to turn Nashville, already a beautiful place, into a smart city where we integrate not only electric vehicles but smart vehicles, with built-in AI that understands the patterns not only of traffic but also of people? When an event happens, the patterns of how transportation moves change; suddenly the bus stops change. Because what we've realized over time is that so much of our technology is value-engineered for the masses, value-engineered to the lowest common denominator. And then what happens? People don't take public transportation like they should. We want to elevate that; we want to bring pleasure back into this whole idea of public transportation and smart cities.


Mardis Bagley 10:48

Down here, the other thing we're working on is a school system for the country of Singapore. And this is really interesting, because Singapore already has a pretty fantastic school system. Why would they reach out to a bunch of designers to do it better? Well, because we're not educators. I've taught at the professor level off and on throughout my career, but I'm ultimately a designer, and they want to break outside that proverbial box. Did you know that in most schools the physical architecture, those square boxes, is actually designed by the same architects that design prisons? And that's probably not a surprise if you've spent some time in school. Well, we're changing that up: we're creating schools that are actually designed with biomimicry, or even biomorphism; we're bringing back organic-looking structures. Not only are people more comfortable, but bringing in green and soft elements lowers heart rates and improves retention of knowledge. It seems kind of funny when I'm saying this; it's like, yeah, of course, people love the outdoors, they're calm. But it's not how we've been doing it for years. So we're changing that up, and we use parametrics to do it, so we can take our model and drop it into multiple different places, changing up the architecture. And then finally, on the right: why do buildings have to be static? Why can't they breathe? Why can't they respire? Can they change with the temperature? Can they change with how many people are in the room? So what we started doing with our project in Singapore, this school for lifelong learning as we call it, is attaching wearables to kids. We're gathering biometric data, galvanic skin response, heart rate variability, and we're using computer vision to detect moods. These kinds of things can be pumped back into the physicality of the building, to actually open windows and change temperatures on the fly.
So, finally, the last thing: as I said, all these things come together into larger systems. We work a lot in space, trying to think about what it looks like if we're going to use the Moon as a launch point for Mars and beyond. Do we want to go up and sit on a gray ball in space? Not fun. So, hey, let's integrate some colors, let's integrate plants; we can have plants that grow in microgravity, we can start to do these things. But again, it's gathering those microbes, those micro data points, to make sure that the architecture is almost living alongside everyone.
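The feedback loop Mardis describes, wearable biometrics driving building actuation, might be sketched like this. The class names, thresholds, and units below are illustrative assumptions for the sketch, not the actual Singapore system.

```python
from dataclasses import dataclass

@dataclass
class BiometricSample:
    heart_rate_variability_ms: float  # HRV; lower tends to indicate stress
    galvanic_skin_response_us: float  # GSR in microsiemens; higher means arousal
    room_temperature_c: float

def building_response(samples):
    """Aggregate occupant readings and decide simple building actuations."""
    n = len(samples)
    avg_hrv = sum(s.heart_rate_variability_ms for s in samples) / n
    avg_gsr = sum(s.galvanic_skin_response_us for s in samples) / n
    avg_temp = sum(s.room_temperature_c for s in samples) / n

    actions = {"open_windows": False, "setpoint_c": 22.0}
    if avg_hrv < 40.0 and avg_gsr > 8.0:
        # Stressed room: bring in fresh air and cool slightly.
        actions["open_windows"] = True
        actions["setpoint_c"] = 21.0
    elif avg_temp > 26.0:
        # Warm but calm room: just nudge the setpoint down.
        actions["setpoint_c"] = 23.0
    return actions

# Two occupants showing low HRV and high GSR in a warm classroom.
room = [
    BiometricSample(32.0, 9.5, 27.0),
    BiometricSample(38.0, 8.4, 27.2),
]
print(building_response(room))
```

A production system would replace the hard-coded thresholds with learned, per-occupant baselines, but the shape of the loop (sense, aggregate, actuate) is the same.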


Mardis Bagley  13:47

We're happy to say that we just won an award from NASA and the Canadian Space Agency for redesigning what it looks like to create food over long-term space travel. So these are all the things that we're up to. Why do I mention all this? Because all of that is very high technology, right? There are a lot of really innovative things there. I'm a designer; do I understand it all? I speak with all the architects, I speak with all the scientists, and I try to figure it out. And quickly, my team and I ingest this information, regurgitate it in a brainstorm of ideas, and push the boundaries. We know material designers are fantastic at figuring out solutions, but they tend to go down a single pathway. We go really wide: we diverge in our ideas, then we converge, and we do this over and over again. And what we've found is that collaborating with a fully balanced team of designers, engineers, and artists is how we are most successful. Okay, I'll pass it on to Greg.


Greg Demchak  14:58  

And then we'll get into a conversation here. As a designer and architect by background, I went through the process of actually innovating the field of architecture, going from 2D representation as the way to build a building to a full 3D model, where the 3D model is what actually generates the floor plan. I started that journey back in 1998 or 2000. Today that product, which is called Revit, is used by architects all over the world, but at the time it was just an idea at a very early stage. And I've progressed in that career building software solutions, trying to imagine a future state, something that is not available today, and imagine how it could scale up. So what I'm talking about is some of the more recent work that's right at the edge: what could it look like if we start now, and what could it be like 20 years from now? Can we have the same kind of innovation that happened in going from 2D to 3D? The company I work for, Bentley Systems, again, is software development: 3D modeling, CAD systems, the ability to interact with 3D systems to represent the built world. So that's sort of our playground: how do we interact with the digital world? And now we're looking at this whole idea of the digital twin: the digital twin of our physical environment, the digital twin of our body, the digital twin of anything that we can interact with.


Greg Demchak  16:23

The truth is, reality can now be captured in near real time. No longer are you making a 3D model and then handing it over; it can actually be rendered in real time. And there's this entire new possibility of systems coming together to create these digital twins, and it involves absolutely everything: humans, machines, IoT, information systems. You yourself will become part of it.
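At its simplest, the digital twin Greg describes is a live data structure that mirrors state pushed from sensors on the physical asset in near real time. Here is a toy sketch of that idea; the asset ID and sensor names are hypothetical, and a real twin (in Bentley's platform or anyone else's) would add geometry, history, and analytics on top.

```python
import time
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    asset_id: str
    state: dict = field(default_factory=dict)
    last_update: float = 0.0

    def ingest(self, reading):
        """Merge one sensor reading into the mirrored state."""
        self.state.update(reading)
        self.last_update = time.time()

# A twin of a (hypothetical) bridge receiving streamed sensor readings.
bridge = DigitalTwin("bridge-042")
bridge.ingest({"strain_gauge_1": 0.0021, "deck_temp_c": 14.5})
bridge.ingest({"strain_gauge_1": 0.0024})  # newer reading overwrites the old value
print(bridge.state)
```

The key property is that consumers (a dashboard, a HoloLens view, an analytics job) read the twin, never the sensors directly, so everyone sees the same mirrored state.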


Greg Demchak  17:08

So I don't know if anyone has seen this film, Big Hero 6. Has anyone in this materials crowd seen it? Fewer than I thought, so I'm not breaking the rules here. But I think it's also interesting to look at cinema and the imagination of filmmakers. This is a film that came from Disney. This kid gets up here, and he's imagining what nanobots, if given agency, and if he could control them with his mind, could possibly do. So here he's got these little robots that he's actually manipulating with mind control, and they will create basically any form that he can imagine, in real time. Now, I don't know if this is going to be possible now, or in the next 20, 30, 40 years, but the vision of this, I think, is super interesting. And the reason I want to show it is that, as I'll show you on the next slide, there's research happening right now that is already starting to suggest what this type of material could mean. As we were saying before the session, if you had these types of nanomaterials that could just self-form through thought, these chairs could have emerged on the stage for us right now, in real time, and then just sort of disintegrated back into a regenerative material cycle.


Mardis Bagley  18:27  

Well, we also know that this ability to move things with the mind is possible, right? We've seen it tested. Right now there's a company called Atom Limbs that is actually using the brain, as well as nerve endpoints, to help people move prosthetics with their minds. And this is really far advanced compared to the claws we've seen previously.


Greg Demchak  18:54  

And if we could roll this video: this is actual research being done right now at MIT. I tried to grab examples that are fresh; this one has only 145 views, by the way. The woman on the screen has a mind-control device on, and she is actually controlling a quadcopter drone just with thought. So if this is happening today, where someone can think about moving a drone through 3D space, can we imagine that you could also have the same thought and actually start to construct three-dimensional geometry? Have things emerge in front of you?


Mardis Bagley  19:30

Yeah, almost, if you had multiple drones, say hundreds of them, at micro scale or nanoscale.

Greg Demchak  19:46

Or at this kind of scale. These are actual examples from 2018, so this is three years old. These are materials that almost have their own form of agency: they have micro-movement, micro-kinetics. So if we can imagine that these materials can have that kind of agility and motion, who's to say we can't get to the point where we conjure things, where you imagine something and it gets created, or where AI, computer-generated systems, could actually produce architecture, produce form, produce solutions that we ourselves can't even begin to imagine as humans? Very interesting.


Mardis Bagley  20:24  

I love the idea of doing it with a sort of generative model, right, allowing AI to build these for us. Suddenly we have the art of the most efficient, and the art of the most efficient, strangely, tends to look like organic structures, because, guess what, planet Earth and the organisms on it have been doing this for millions of years. It's really kind of a beautiful way to let networks and mesh robots come together and develop things.


Greg Demchak  20:56  

And this is another example, from a colleague of mine who now works at New Balance on new materials for shoes. He's an architect by background too, and he has been exploring the use of GANs, generative adversarial networks. It's a very interesting form of machine learning that takes previous work as input and can basically generate new versions of that artwork through its own training. In this example, the input is an artist named Lebbeus Woods, who imagined all these possible architectural futures but never built anything; instead, he made a lot of 3D models and sketches. When you apply this type of machine learning to them, you can generate sketches that Lebbeus Woods would probably consider his own artwork. So if you have that as an input, and our cities can be produced through these nanobots that can organically self-organize, what does that mean for the future?
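The GAN idea Greg names, a generator learning to imitate a body of work while a discriminator learns to tell real samples from generated ones, can be sketched at toy scale. This is a deliberately minimal, hypothetical illustration: 1-D numbers stand in for artworks, both networks are single-weight models, and the gradients are derived by hand; a real GAN over Lebbeus Woods sketches would use deep convolutional networks.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator: z -> w_g * z + b_g maps random noise to a "sample".
w_g, b_g = rng.normal(), 0.0
# Discriminator: x -> sigmoid(w_d * x + b_d) estimates P(x is real).
w_d, b_d = rng.normal(), 0.0

lr = 0.05
for step in range(2000):
    real = rng.normal(4.0, 1.25)  # one real "artwork" from the target style
    z = rng.normal()              # noise input
    fake = w_g * z + b_g          # generated sample

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    d_real = sigmoid(w_d * real + b_d)
    d_fake = sigmoid(w_d * fake + b_d)
    g_w = (d_real - 1.0) * real + d_fake * fake  # d(BCE)/d(w_d)
    g_b = (d_real - 1.0) + d_fake                # d(BCE)/d(b_d)
    w_d -= lr * g_w
    b_d -= lr * g_b

    # Generator step: push D(fake) toward 1 (fool the discriminator).
    d_fake = sigmoid(w_d * fake + b_d)
    g_fake = (d_fake - 1.0) * w_d  # gradient flowing back through D
    w_g -= lr * g_fake * z
    b_g -= lr * g_fake

# After the adversarial game, generated samples should drift toward the
# real distribution (mean 4.0) rather than the noise they started from.
samples = w_g * rng.normal(size=5000) + b_g
print(samples.mean())
```

The adversarial loop, not the tiny models, is the point: the generator never sees the training data directly, only the discriminator's opinion of its output.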


Mardis Bagley  21:52  

I like this idea that you can actually start to do it. Again, biometrics are really, really important. We can start to gather data about ourselves, our habits, our moods, and suddenly maybe I'm not just designing from what I see. When you design a house, say you're building your first house, you're going to take examples from existing houses. You're going to say, oh, I love that house down the block, I love Mies van der Rohe, the Bauhaus is great, and you're going to come up with a mashup. Maybe instead it's based not on what you see in your external world, but on your emotions, your attitude, your temperature, all these different sorts of biometric feedback.


Greg Demchak  22:34  

To connect that to another thing I get to play with in my day job, and this is again back to thinking about what exists now and what 20 years could look like: this is the Microsoft HoloLens, version two, a head-mounted, wireless computing device that projects holograms into your eyes. And I can tell you this: I am interacting with three-dimensional holograms on a daily basis, regularly, because of these sensor networks, which pick up my body motion and my head gaze, and eventually all my biorhythms and everything else. One thought we had here: imagine in 20 years, when this is literally just a pair of glasses on your head, or a pair of contact lenses. If this is version two now, and it's functional today, project this form factor out 20 years and you can imagine. One thing I've realized working with this goes back to something I think Steve Jobs showed when he first launched the iPhone, and it's a material statement too. He made the point that when he released the iPhone, he removed all the physical buttons from the phone; the screen became the interface. What becomes interesting is that with an entire world surrounded by sensors, in fact sensors that react to your body, you would no longer need a mouse or a keyboard or a screen, because this generates the screen for you in your environment, and any surface can become that screen. In fact, the whole idea of mechanical buttons becomes essentially obsolete.


Mardis Bagley  24:13  

Well, mechanical buttons never existed before the Industrial Revolution anyway. Find a tree with a button; I dare you, it doesn't exist, right? They don't exist in nature. We shouldn't have them in our products, and eventually we're going to get rid of them. With the Human headphones from earlier in my presentation, we removed all the buttons; it's just gesture controls, swipe forward or back. And I love what you're talking about: when you have the HoloLens on, you no longer need a mouse. You might instinctively still make this gesture, because that's what we've been used to for generations now. But maybe you get a block of wood, maybe you think it's a finely sculpted piece, maybe it's a chrome sculpture or something, but it doesn't have batteries, it's not injection-molded, it doesn't have all those toxic things we put in landfills. Suddenly it's the best sustainability story we've ever crafted. So VR, AR, these augmented realities, holograms, become a beautiful story about saving our planet. Let that sink in for a bit.


Greg Demchak  25:22  

Yeah, and to follow with this video, which I shot last week. It sort of drops the metaverse theme into this, but I mention it because it is real: I actually ran this experience last week with a colleague. We were in different locations, working together with digital versions of ourselves, having a collaborative design session. And in terms of materials, what we've completely eliminated from our interaction is any need for a physical keyboard, a mouse, or a digital screen. Everything we're seeing is being projected as light directly into our minds, and we're communicating in real time with one another. So I think this totally changes the nature of interaction. And if we get to something like graphene gloves, the next level would be force-feedback interactions with your digital world, so that when you pick up that hologram, you have a sense that you can feel it, touch it, interact with it.


Mardis Bagley  26:20  

Which brings up the whole idea of telepresence, right, and digital twinning. You could be connected with somebody on the other side of the planet; you could be connecting with someone in a disaster situation, trying to help them out. You can calmly do whatever you need to do, share information, reach into those disaster situations, and feel the environment around you. Haptic gloves have pressure sensitivity, so you can feel rocks or moving things as you please; maybe you can feel temperature as well. Which makes us think: we're not going to be on Mars for a little while, we're not going to be on too many planets for a little while, but we have left our solar system with probes and satellites. Maybe we can start to pick up data along those journeys in outer space. If, in deep space, we can gather enough data and send it back to this virtual environment we've just created, we can mimic what's happening on any planet we've gathered data about. Suddenly we're not just looking through telescopes; it actually feels like we're on that other planet. The temperature, the texture, maybe the wind. How about the sense of smell, smellovision, if you will? I mean, that's kind of a real thing. The best ways to immerse somebody in a virtual experience are smell, touch, and sound. Sound and smell are some of the best memory tools you can ever imbue upon someone.


Mardis Bagley  27:58

Think about it: when you hear a song you haven't heard for a long time, and you love that song, you probably remember the last time you heard it. Or when you smell a cologne: there's a scent somebody wore when I was in high school, and I know who it is. Every time I smell that one cologne, I know exactly the moment in time I was there; I remember what happened. And then I'll never smell that smell again for 20 years, because they don't make the cologne anymore; it's not as ubiquitous. So I think it's about pushing the boundaries of what VR is. It's not just about the eyes; that's only one sense. We're multi-sensory creatures, and synesthetic experiences, the overlapping of the senses, are really important. I can tell you, with the school we did for Singapore, we realized that Singapore is a meritocracy-based system, right? You succeed because you got an A+; you go on to the next level, you get a good job. But there are a lot of brilliant kids out there who may not do well on tests but are fantastic at drawing, fantastic at sculpting. Maybe they just learn differently. So we can align these VR experiences with the neurodiversity of everybody who uses them: Mardis works better visually, Greg works better with sculpting, or something like that. 3D is my thing, but maybe words are his. How do we figure that out and apply it to that virtual experience, to heighten everybody's overall experience?


Greg Demchak  29:39  

Yeah. And then carry it into the physical world, too. So how do we get to that Big Hero 6 vision, practically? I think it's a powerful vision, and I think there are examples of that technology already starting to emerge. Maybe it happens through an event like this, where scientists come together and actually start to solve the problem here today, and in 20 years we'll look back at it and go, yeah, it was obvious.


Mardis Bagley  30:05  

I love it. I love it. Well, it's only going to work if we all collaborate, right? You scientists, reach out to your designer friends; designers, reach out to your scientist friends. Bring everyone into the fold and create that conversation. Go to those really far-out points and make yourself really uncomfortable. Diverge, then converge on ideas, and do it over and over again. This is what we've found is key to innovating and moving very, very quickly.


Greg Demchak  30:34  

Cool. Thank you, Mardis. I think we're at time. Thank you all so much for your time.


Mardis Bagley  30:37

Thank you, everyone.



Author

Related Contacts

Mardis Bagley

Mardis is a Creative Director in industrial design, striving to connect with users through hardware and brand experiences. He creates stories that resonate with users and has extensive experience with collaborative exercises, research and iterative processes that include sketching, 3D modelling and rendering. Mardis has managed contractors, manufacturers and has travelled overseas numerous times to shepherd products through manufacturing to create products that are smart and eco-conscious. In addition, Mardis is a graphic designer and visual artist with skills that include ideation, logo development, information flows, wire framing, visual web and app design, packaging, trade show and other collateral pieces.

Greg Demchak

Greg Demchak is an agent of change who is passionate about the role of technology with respect to how the built environment is designed, constructed, and operated. Educated as an architect at MIT, he has a deep appreciation for design, history, computation, and the evolution of design thinking. His design background has morphed into a focus on the user experience design for computer-based modeling systems, most notably Revit, Synchro, and the HoloLens. Greg has developed skills in agile product management, user experience design, hackathons, and customer-driven innovation prototyping. He is starting a new project within Bentley Systems to lead an innovation lab dedicated to mashing up various technologies in order to discover impact.
