Interview with the Creators: MistySOM


We'll admit, when a campaign is in its "Pre-Launch" stage, it can be hard to convey the marketability and benefits of a project you can't yet hold in your hand - let alone the knowledge and years of experience behind it. MistyWest CEO (or "top dog"? - the MistyWest team prefers a horizontal management structure) Taylor Cooper can certainly talk the talk, and his seemingly effortless way of describing the theory and computation behind the many projects he has been a part of can go right over your head. In our latest interview, however, Cooper addresses the development and capabilities of the team's newest creation, the MistySOM, succinctly.

Watch the video below and check out the abridged transcript!


Brief Overview of the Campaign:

The MistySOM is built from the ground up to enable battery-powered computer vision applications.

Harnessing the power of the Renesas RZ/V2L microprocessor with a dedicated Neural Processing Unit (NPU), MistySOM can perform AI computer vision tasks at roughly half the power of comparable processors on the market. MistySOM delivers high-speed AI inference at low power consumption, accelerated by the onboard NPU, which supports standard ONNX ML models. Images can be captured through the MIPI-CSI interface and H.264-encoded.
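Before any frame reaches an ONNX vision model, it is typically cropped, resized, and normalized into the NCHW tensor layout most models expect. As a rough illustration of that step - using only NumPy, with a random array standing in for a MIPI-CSI capture, and `preprocess_frame` being a hypothetical helper rather than any MistySOM API - it might look like:

```python
import numpy as np

def preprocess_frame(frame: np.ndarray, size: int = 224) -> np.ndarray:
    """Center-crop, resize (nearest-neighbor), and normalize an HxWx3
    uint8 frame into the 1x3xHxW float32 tensor layout that many
    ONNX vision models expect. Exact size/normalization depend on
    the specific model."""
    h, w, _ = frame.shape
    # Center-crop to a square region.
    side = min(h, w)
    top, left = (h - side) // 2, (w - side) // 2
    crop = frame[top:top + side, left:left + side]
    # Nearest-neighbor resize to the model's input resolution.
    idx = np.arange(size) * side // size
    resized = crop[idx][:, idx]
    # Scale to [0, 1] and reorder HWC -> NCHW with a batch axis.
    tensor = resized.astype(np.float32) / 255.0
    return tensor.transpose(2, 0, 1)[np.newaxis, ...]

# A fake 480x640 camera frame standing in for a MIPI-CSI capture.
frame = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
blob = preprocess_frame(frame)
print(blob.shape)  # (1, 3, 224, 224)
```

On the actual hardware the model itself would then run on the NPU rather than on the CPU.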

Feature Overview of MistySOM:

  • Low power requirements
  • Yocto Linux operating system
  • AI accelerator
  • Cortex-A55 (Dual or Single)
  • Cortex-M33
  • 3D graphics engine (Arm Mali-G31)
  • Video codec (H.264)
  • Camera interface (MIPI-CSI)
  • Display interface (MIPI-DSI)
  • Two USB2.0 interfaces
  • Two CAN interfaces
  • Two Gigabit Ethernet channels
  • 2GB DDR4 RAM
  • Supports various versions of WiFi and Bluetooth interfaces on the I/O board
  • The board is compatible with the Renesas RZ/G2L processor for less demanding applications


The campaign is currently in "Pre-Launch"; the following dates are subject to change, but the team has provided the following estimates:

Alpha units developed and prepared for launch: October 25th
GroupGets Campaign for Launch: October 31st
Boards ready for delivery to backers: February 1st



We've worked with you and MistyWest for a long time, but can you give everyone else an overview? 


MistyWest's core areas of expertise include optics, embedded vision, machine learning, Azure cloud services, low-power electronics, Bluetooth, Wi-Fi, cellular and satellite. Their services include engineering, design, embedded systems, specialized research, and industrial and UX design.

Cooper: I think you really hit the nail on the head there. I don't know what else to add, really, other than to say that we offer engineering services for product development. If there's something you want to measure, if there's some place where you need data and you need hardware to get that data, we're a good company to come talk to.

I've been working in product development for about 15 years now. I started out as an engineer, obviously, eventually did some sales work, and ended up leading MistyWest. Though I'm not really a CEO - I'd say we're a holacratic organization. I'm just the person with the job title closest to CEO.


Your team has taken quite an undertaking with the MistySOM - why did you develop this?


Cooper: We're a services business. We help people build IoT devices. And as you all know, towards the end of the pandemic, "Chip-Mageddon" struck and a lot of our customers and clients were just not able to find the parts that they needed to build their products. NXP, Broadcom and a lot of these solutions were just going out of stock early or having lead times beyond 52 weeks.

We had to look around to see who was less impacted by the chip shortage. And one name really popped out, and that was Renesas. They have a multi-fab situation: they work with Samsung, they also work with TSMC, and they have factories in China and Japan that they've maintained themselves. We're not the only ones who jumped on the bandwagon - if you look at other manufacturers, you'll see them releasing or starting to release Renesas-based product lines as a result. But we decided we were going to start building a SOM to really address those problems for our clients. And as we started doing that, a whole bunch more conversations came out of it.

We've decided to launch a GroupGets campaign to basically open this up to the world and put it out there so more people get access to it. We're specifically building the first version based on the Renesas RZ/V2L module, and that's a computer-vision-focused SOM. The other reason to select that solution is that the RZ/V2L and RZ/G2L are pin-compatible, and the latter comes at a lower cost. It's a pretty standard system-on-chip solution that's comparable to a Raspberry Pi - in a lot of ways, it can do a lot of the same things. So we're looking to build a minimal-footprint, low-cost SOM that'll be appropriate in a lot of applications. We're going to support it with a carrier board as well so that you can grab it and get going.


The "Bald Engineer" wrote up an article describing it as "50% of the power, at 100% of the compute power" - how does that encapsulate the MistySOM?


Cooper: A great way to think about if this is good for your application is, you know, are you doing computer vision that requires intelligence at the edge? And are you currently using the Jetson Nano or something similar to that or a Raspberry Pi compute module?

The difference between what we have and the Jetson Nano is that the Jetson has a graphics processing unit - a GPU component - and you can use that to do computer vision. It'll have slightly better software/firmware support. We have a dedicated part of the silicon that is great at running the algorithm.

Say you wanted to do object detection or image segmentation: you can basically train your algorithm in TensorFlow. You could do it in SageMaker on AWS or on your laptop. There's a conversion tool that will take that, in ONNX format, and convert it to a binary that can be uploaded to the chip, and it will run that.

One of the things we did at Embedded Vision this year was demo a comparison of that NPU against the Jetson Nano, and usually it's below 50% of the power consumption. The reason is that the GPU is general-purpose, whereas the NPU is dedicated to the really specific type of linear algebra operations needed for this kind of edge compute. It doesn't have to move information around as much and, as a result, is much more efficient.

For comparison, the Jetson has a big heatsink on top of it; we have no heatsink whatsoever. So it's a pretty significant power saving, and it has about the same number of teraflops. If you're thinking battery-powered computer vision at the edge, this is a great option for you.


Now that in-person conferences are in full swing again, how was Embedded Vision 2022?


Cooper: It was really exciting. There were a lot of really interesting companies there, and we were lucky enough to be presenting alongside them in the Renesas booth.

At that stage, we were still doing some of the design tasks. There was definitely quite a bit of interest. It does have a specific application in mind, so it's not a SOM for everything, but the things that it does do, it does really well.


You currently have a call for alpha testers - the link and information are located on the campaign page. Who would be best to apply for this?


Cooper: We're open to anyone who is really excited about this. In terms of who would make a great alpha tester, it has to be someone with a bit of technical competency - someone who is familiar with developing on Linux and things like that is going to have a better time at this stage.

Documentation will be an ongoing thing - we're working on it, and we are hoping to provide support for this. But if you have an application that is battery-powered computer vision and you need compute similar to the Jetson Nano, talk to us - that's the simplest way to say it in terms of applications. Right now we're talking to people with applications in smart cities, sports cameras, wildlife tracking, and preventative maintenance in factories.

We've partnered with a local company called Novax Industries. They're a North American leader in pedestrian safety equipment: think of speakers that announce when you can walk, signs that indicate it, or buttons you can push, things like that. We're working with them on a project to basically create a pedestrian tracking solution where the intelligence is on the edge. Instead of having to send a video stream up to a server and pay something like $2,000 a month, the idea is to run it at the edge and do the inferencing there. There are also some benefits around personal privacy: you don't have to send images of people to a server. There are lots of people in the space, obviously, but I think a solution like this is going to have some unique advantages. With a low-power solution, when somebody is there and present, you can detect them and take that video; and when they're not, you can operate in a low-power mode.
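The detect-then-record duty cycle Cooper describes can be sketched as a toy control loop. Everything here is hypothetical illustration - `detect` stands in for cheap on-NPU inference and `time.sleep` for a real low-power hardware state; none of these names are MistySOM APIs:

```python
import time
from typing import Callable

def duty_cycle(detect: Callable[[], bool], record: Callable[[], None],
               frames: int, idle_sleep: float = 0.0) -> int:
    """Toy version of the low-power loop: run a cheap on-device
    detector every frame; record video only while someone is present,
    otherwise idle (where real hardware would drop into a low-power
    mode). Returns the number of recorded frames."""
    recorded = 0
    for _ in range(frames):
        if detect():                # on-NPU inference, no server round-trip
            record()                # capture/encode only while relevant
            recorded += 1
        else:
            time.sleep(idle_sleep)  # stand-in for a low-power state
    return recorded

# Stub detector: a pedestrian is "present" on frames 4-6 only.
presence = iter([False, False, False, True, True, True, False, False])
clips = []
n = duty_cycle(lambda: next(presence), lambda: clips.append("clip"), 8)
print(n)  # 3
```

The privacy benefit falls out of the same structure: only the decision (and, if wanted, the relevant clip) ever leaves the device, not a continuous video stream.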

For a wildlife tracking application, we're talking to another company that does bycatch monitoring on fishing boats. You can imagine a camera system installed on a fishing boat constantly checking to see if there's any bycatch - say they accidentally catch a shark when they're trying to catch salmon, things like that, because sharks are critically endangered right now. They just have terabytes and terabytes of data, so they have a bunch of hard drives, they take all those hard drives off the ship, and someone has to go through them manually. And it's a huge to-do! With this technology deployed, you won't need a data center of hard drives.

There are all these computer vision applications for the RZ/V2L, but almost in parallel to this, we're going to be releasing a G2L SOM. It's going to be identical, and it's also going to be industrially rated for temperature, as will the carrier board. You can stick these things in a box and have them run as an IoT hub in adverse conditions.


Follow MistyWest for more updates and news about the MistySOM!




