Smart Cities: AI & Predictive Modelling Meet IoT

IoT Leaders with Nick Earle, CEO of Eseye, and David Ly, Chairman & CEO of Iveda

What’s the winning recipe for a successful smart city? People want safety, security, and convenience. If city managers can provide all three, they keep people happy.

But what if we can take smart city management one step further? In this episode of IoT Leaders, David Ly, the Chairman and CEO of AI-driven video and IoT sensor solution company Iveda, dives into the latest developments making smart cities an imminent reality.

The show explores:

  1. The lessons learned on a long journey from refugee boat to tech powerhouse
  2. What it takes to build predictive modelling solutions key to smart city development
  3. How necessity can be a real driver of innovation and success
  4. Why processing at the edge is so critical in solving new technology challenges

Tune in to hear how Iveda pioneered the use of the cloud, big data, ML and image recognition, and how it’s using the latest AI developments to build smart cities for the future.

Subscribe to IoT Leaders

Ready to take the mic?

Join us on the IoT Leaders Podcast and share your stories about IoT, digital transformation and innovation with host, Nick Earle.

Contact us

Transcript

Nick Earle:

Hi, this is Nick Earle, CEO of Eseye, and welcome to the IoT Leaders Podcast. We have a very interesting episode for you this week. We’re going to start this week in Vietnam, then we’re going to go to California, San Jose, and from there to Arizona. I’m talking to David Ly, the CEO of a very interesting company called Iveda, and you’ll hear his personal backstory of how he escaped on a boat from Ho Chi Minh City, went in fact to Japan and then to the US, and built one of the most successful and innovative video companies. He has floated it twice in the US, a “re-IPO” as he calls it, and has been a very early pioneer not just in video surveillance, but in machine learning and now AI for smart cities. 

And the vision, what they’re actually implementing and working on, is creating a set of services to make smart cities a reality, to bring new revenue for the people who manage cities, to bring a new set of services to citizens and visitors to those cities, and just to make everything work much more efficiently. And as David puts it, “And make you even happy to pay your taxes if we can do all of that.” It’s a really interesting business and personal story. David’s based in Mesa, Arizona. Here is his story, and I think you’ll find it very interesting and stimulating in terms of where we’re going in the world of IoT and video. Here it is, enjoy. 

David, hi. Welcome to the IoT Leaders Podcast. 

David Ly:

Good to see you again, Nick. Thank you for having me. 

Nick Earle:

You’re very, very welcome. I know you’ve got a really interesting story both from a business perspective, it’s actually a really great success story, building a company, floating the company, but also from a personal perspective. So we’re going to go into the whole world of video here. But before we do, as I mentioned, you’ve got a really interesting personal background, and I know you don’t have a problem talking about it because you’re very proud of it, and rightly you should be. So before we get into Iveda, maybe you could just recap for our listeners and viewers your own personal background, which is really quite amazing. 

David Ly:

Sure. And thank you for the opportunity. Well, I always start by telling people, and it captures everybody’s attention, that I’m what you call a Vietnamese boat baby. Why that term was coined was that I actually had to escape, and my grandfather bought me passage. Of course, my aunt had to take me back then, because my father was in the military and he had to run and hide in the jungle just after the fall of Saigon. This was 1975, when the US military pulled out. Long story short, I was about four years old, and we hid near open water, where we could escape into open water. I was part of about 70 or 80 people sardined in a fishing boat that escaped out to open water. That’s why we call it the boat baby. And in open water back then, I don’t know if most people know, there was no destination other than to just float out. 

Nick Earle:

Just to get out? Yeah. 

David Ly:

Yeah, the hope is that you get rescued. And fortunately, Nick, my boat was rescued by, I recall, a Norwegian fishing liner, a much bigger boat. They picked us all up, rescued us, but stopped and dropped us off in Japan. So I chose to stay in Japan and others chose to go back to Norway. Why was that? It was close, and my aunt actually spoke Japanese back in the day through her teachings. So it was all very fortunate and divine support. I was a refugee in Japan for a year and got to immigrate to the United States thanks to another aunt and a church sponsorship in California. So at the age of five, I immigrated to the United States and started kindergarten, perfect timing again, and the rest is history. I grew up in San Jose, California, went to college in San Francisco, and did the typical American thing: got the college degree, found a job, worked the job, and decided, “Hey, what else is there?” So fast-forward to 2003, and I founded Iveda, the company that I’m running today. But yes, that’s the boat baby history. 

Nick Earle:

And I’m sure that’s the short version, there’s a lot more you could fill in. I mean, it’s just amazing, and I wanted to highlight it because a lot of the things that you’ve achieved since then are probably based on the lessons and the resilience you gained during that period. So thanks for sharing that. 

Let’s talk about Iveda. I mentioned at the beginning it’s all around video, but what you’re doing now in the areas of artificial intelligence and smart cities, which we’ll eventually come onto, is a long way from what you were doing in, you said, 2003 in video. Actually, just on a personal note, I was based out in the Valley for about three or four years, just before I got out, though not the same way you got out. I left Silicon Valley to come back to the UK, and I was based just outside San Jose in Saratoga, so I must have left late 2001, just before the crash. Not that I could predict it, I just happened to leave and then it crashed. So I know the area very, very well. 

And so why video? How did you start out with your first product or first offering? 

David Ly:

Sure, and just coming off of 2001 when you left, when the dot-com bubble fell, let’s say, that’s actually when I got laid off from one of the wireless companies I was so proud to work with and work for. So yeah, it was really a downturn. But part of that was what motivated me to go out and do something on my own. So fast-forward from that layoff of 2001: that’s when I moved to Arizona. I got a new job, moved to Arizona, and headed up the southwestern region, hence Phoenix, Arizona. 

But I met a security firm, a pretty large physical security guard company that handled the region here, and it was through them that I got the idea, right? They had a lot of issues where they were sending out mobile patrol services because Safeway paper bags flapping against the fence triggered alarms. Now mind you, it’s hot here in Arizona, you’ve got to start the car, the AC takes a while, and it’s consuming a lot of resources. The idea was, can we verify? Can we see video, look at something to make sure: did somebody cut the locks, or was it just a Safeway bag flapping in the wind? And that’s what I did. I helped this company, I started moonlighting with them, and put up a Logitech webcam on an AT computer back then and gave it an open port. And we viewed a sample alarm from Phoenix when that alarm location was in Tucson. So you can imagine that was really cool. 

Nick Earle:

That was breakthrough, breakthrough and radical. 

David Ly:

Yeah, I mean, one camera and that was like, “Wow.” But it served its purpose, immediate value recognition, and that’s when it started, that’s when the idea dawned on me. And I’m not a software guy, so I called one of my good friends from back in school, who was out in California, and we wrote some quick CGI web deal back in the day and put four cameras on one screen, right? Four cameras from four different physical locations on one screen. That’s the company. It started, and it was like, “Holy cow.” And through the relationships of that security company, we had more end users buy into it. They just had issues and recognized that. They didn’t quite understand how it all worked, but they said, “Look, if my property can benefit from it, let’s do it.” And that’s where it started scaling. We were one of the first companies doing it; I went out and bought some IP cameras, built some servers. Now the limitation back then, Nick, was bandwidth. We’re still talking DSL, ISDN; T1 was like the big deal back in the day. 

Nick Earle:

And expensive. 

David Ly:

Yes, very expensive. I mean, for what we paid back then for a T1, I think we’re getting like 50 gigs right now. 

Nick Earle:

Probably more, yeah. 

David Ly:

Yes. So with that said, we were one of the first companies to do it; Iveda got into big data because we ended up hosting video. We hosted customers’ video so that they did not have to handle, manage, or maintain a box. That box was the recording device, whether it was a hard drive or otherwise; we eliminated the box on site by taking video into the cloud. And there you go, that’s when Iveda really started expanding. We were one of the first companies that hosted video in a data center, what people called the cloud back then, right? Buzzword. And we charged people a hosted service fee every month, and that evolved the company into a real-time monitoring company. We now held customers’ video data on their behalf, and they could access it over the internet at any time. Then it became clear that, look, nobody has time to watch video, and all these alerts were just a burden. So Iveda had a service where we had live humans watching 24/7, paid by the customer per camera per hour, and we made reports manually, Nick, back in the day. We actually- 

Nick Earle:

David, that does not sound like the world’s most exciting job right there. 

David Ly:

But you’d be surprised, it filled a lot of curiosity, and remember the voyeur cam concept back in the day. So people could, like, legally become voyeurs, okay? What’s going on? So that said, it scaled very well. We actually had a lot of customers, but this is where I saw the bottleneck. The human eye can only watch so many squares on a screen before it loses efficiency and we start missing things. 

Nick Earle:

We know. I remember when we met for the first time to prep for this podcast, I used the story of one of the most viewed videos, I think, on the internet: the gorilla with the basketball. They ask people to count how many times four or five players pass the basketball between them. And in the middle of the video, a guy in a gorilla suit walks on, waves at the camera and walks off. When asked afterwards, “Did anything unusual happen on that basketball court?” something like 90% of people said no, I didn’t see anything at all. 

So if you’ve got guys and girls watching cameras on the outside of people’s premises, I guess they miss stuff, and it’s tiring just staring at screens. And as you say, there had to be a better way of doing that. But the technology wasn’t there at the time to… I mean, we’re talking early here, the technology wasn’t there. We’re not talking about the artificial intelligence that we have now that can recognize images, distinguish between images, and replace the human completely. So how did you start off solving that problem? 

David Ly:

It indeed was a problem. It was a human problem that couldn’t be easily solved. We had multiple shifts, we switched off for multiple breaks, lots of physical activity to try to minimize the loss. We’ve got to see the gorilla, we can’t miss the gorilla. That was the objective. There were analytics back in the day, if you recall, right? If something moved, object movement, you could pixelate certain areas of the screen to narrow it down. So we did what we could, but we were still missing that gorilla a lot. 

So we got real, and this is where the beginning of machine learning software came in, where you could teach object recognition. What is an object? It could be anything, a basketball. But in our case, we needed to identify humans versus paper bags, plastic bags, dogs, moving tree branches, et cetera. So we were able to implement object recognition early, back in 2008, that recognized a human silhouette or anything that scored 90% plus as human. Only then would our alerts trigger, so that our monitoring personnel, we called them intervention specialists back then, paid attention, and it helped a lot. And that was really the spark, I would say, that evolved Iveda from a big data company into a true machine learning organization, by obligation. 

We wanted to enhance our business, so we had to do something unique. We got some developers and expanded on traditional video surveillance analytics. We built recognition that became self-learning. It had to teach itself, it had to improve on its own. Of course, we still had to feed it data, right? It did not know how to grab its own data like machines today. So that’s where we evolved, Nick, from an IP cloud video surveillance company into the AI and IoT smart city company we are now. There are cameras all over the world, and you being in the UK, you know that better than most. 
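To make the confidence-gating idea concrete, here is a minimal sketch of “only alert when something looks sufficiently human.” This is not Iveda’s implementation; it uses OpenCV’s classic HOG person detector purely as an illustrative stand-in, the clip filename is hypothetical, and the 0.9 threshold simply mirrors the “90% plus human” rule of thumb (HOG’s SVM scores are not true probabilities).

```python
# Minimal sketch: raise an alert only when a frame contains a human-like detection.
# Illustrative only; uses OpenCV's stock HOG person detector, not Iveda's models.
import cv2
import numpy as np

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def should_alert(frame, min_score=0.9):
    """Return True if any detection scores at least min_score as a person."""
    rects, weights = hog.detectMultiScale(frame, winStride=(8, 8))
    scores = [float(w) for w in np.ravel(weights)]  # SVM scores, not probabilities
    return any(s >= min_score for s in scores)

cap = cv2.VideoCapture("alarm_clip.mp4")  # hypothetical clip tied to an alarm event
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if should_alert(frame):
        print("Human-like object detected: escalate to an intervention specialist")
        break
cap.release()
```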

Nick Earle:

I think we’ve got more cameras per head of population than anywhere. And by the way, nobody complains. We won’t allow people to have a national identity card, we think that’s a huge infringement of our personal liberty, but we’re okay with there being 4 million cameras in London and you get filmed 200 times a day or something crazy. 

David Ly:

Yeah, and with the AI now, the AI does what our human intervention specialists used to do, except it does it, if you can believe it, at 99 to 100% now. We’re not missing the gorilla anymore. We’re now anticipating when the gorilla might show up, to stay with that analogy. So AI now really can be tuned to almost any or all objects that you wish for it to recognize and learn. So in our business now, we’ve evolved beyond just providing that service and having cameras that are smart. 

We decided to step out of the physical hardware realm and really expand the opportunity by saying: wherever you have a camera, wherever you need a camera to do something more for you than just record, that’s where Iveda comes in. What makes Iveda AI and the technology that we offer now, Nick, so valuable is that it can be leveraged with any existing physical infrastructure out there today. So you don’t have to buy anything new, you don’t have to rip out or overhaul your existing systems. We come in and play in parallel with what you’ve got, and we make everything you have, even if it’s 10 years old, smart. And that’s how cities are able to evolve a lot quicker when we’re talking smart city expansions. 

Nick Earle:

So let’s unpack that a little bit, because you covered a lot of ground quite quickly there. If I can just take you back: if I’ve got the timeline right, around 2008 you started saying, “Well, instead of humans staring at screens,” in the early days of ML, and then you jumped forward to no longer saying it only works with this hardware. Because one of the big issues, which we’ll come onto, is that people haven’t got the budgets for new stuff all the time; what they really want to do is integrate with the legacy stuff, but all the legacy stuff is from lots of different manufacturers, and so you have to solve that problem. We’ll come back to that. 

In between those two, I just want to go back to you being a pioneer in ML and what became known as AI. And of course you were doing machine-to-machine before it became known as IoT. So did you get some help? Isn’t there an angle here that you actually got some help back in Asia, your story about Japan, from the academic world out there, to really create things that didn’t exist? Because this software, this object recognition and all of that, it wasn’t off the shelf, it wasn’t available like it is today. You had to go and find people who could build it or help you build it. Is that right? 

David Ly:

I think you took better notes than my own memory. You are correct. About that time I had the opportunity to be introduced to an institution in Taiwan by the name of ITRI, the Industrial Technology Research Institute of Taiwan. It was one of our company advisors who took me out to Taiwan and just showed me the market out there. It was like London, cameras everywhere, and in America, the US, that was new to us. But I got out there and I was wowed, back in the day, right? When I looked up and said, “Wow,” that meant something. 

And yes, I was introduced to an organization that was government funded, and they were an incubator, Nick. They did everything from biofuels to solar and alternative energy to developments such as machine learning and artificial intelligence. That was very, very new back then. So given the opportunity, I was able to recognize technology that had not yet been utilized in the market. That’s what it was: the technology was built, but it was more academia, more research and development. But I got to sit in a room and saw how a machine can learn on its own. That’s when I said, “Oh my goodness, this is where it could be real time. This is how the real world operates.” We don’t know what to teach. We don’t know when to teach. But if we can apply real-world problems at any time and tell the machine this is a problem that needs to be solved, really that was the concept. 

And Iveda was fortunate enough at that time, through the proper relationships and financial positioning, to partner with and then acquire what people know today as Iveda Taiwan in 2009, and that was the point at which we went public for the first time. We needed the public capital to make that work. 

Nick Earle:

Was that on the NASDAQ, David? Is that right? 

David Ly:

Yeah, we were too small back then, Nick, we were too small back then. So we were on the American OTC, Over the Counter. 

Nick Earle:

Okay, the OTC, yeah. 

David Ly:

Recently, a couple of years ago, we uplisted onto NASDAQ and did what would be coined the re-IPO, right? But yes, you are so right, and to this day we have what’s called an evergreen relationship and first right of refusal on innovation. This is why a company like ours is still so nimble, so flexible, yet we can launch technology so fast: we’re able to address real-world customer issues with technology that not too many people get to see at this point in time. So that’s the benefit here at Iveda. 

Nick Earle:

And I want to get onto, as I say, the smart cities piece and the AI and the vision, but just to stick on this for a while, what gets me and impresses me is that you were doing it with images. And the reason I make that point is that around about that time, I met several people who were trying to do similar things, but most of them were sort of saying, “Well look, let’s take your call center or your people raising trouble tickets in the support business. If you’ve got all the answers written down in manuals, well, you can get the computer to read the manuals because you’ve got everything written down, the documentation, and then you can actually feed it the 10 most frequently asked questions and it can spit out the answers without talking to a human.” 

In my example, when I was running worldwide operations for the services business at Cisco, around about that time, a little later actually, Cisco, with 50 million boxes out there, was drowning in support calls, and they managed to automate about 90% of support calls so that you talk to a piece of software rather than a human. But the way a router or a switch or a wireless access point or a telepresence system works is very fixed. There are lots of different software variants underneath it, so it is a much bigger problem, but basically the rules are written down, the engineers have codified them. And so I could see how that could be automated, and they got to 90%. 

In your case, you’re talking about images, which seems to me to be a much harder problem to solve. I guess I’m right in saying that you must be one of the pioneers in this space, to be able to get a machine to learn in the image space as opposed to just proving it can be fed codified information and learn. Would that be right? Were you one of the pioneers in this space? 

David Ly:

I’d love to take your word and accept that, yes, we feel that we are, and I’m proud to be able to speak of it. I’ll tell you, we did a lot of things simply by obligation. I’d love to say that I planned all this and was the whizz kid behind all this wonderful stuff. But no, we had a lot of problems we needed to solve. One thing we did have was a video infrastructure; we were in big data. You can imagine we had thousands of cameras floating all over, so we had a lot of data. And what I had learned was data packets. Coming from the packet data world in the late ’90s and early 2000s, that gave me the concept and notion that, look, if we need to teach a machine something, that machine needs to learn how to receive a constant data flow. 

And in our world of video surveillance, that’s RTSP, the Real-Time Streaming Protocol. The developers and engineers who built machine learning at the institute we had help from weren’t video experts, they were machine learning experts. Going back to IBM Watson, you feed it something, you need a response back, that’s better. But now you take that real-time streaming protocol and you feed it in, and you have a mechanism that can record, that can learn, that can understand what each of those data packets means and compare it against the next one that’s coming in and the previous one. I’m hinting to you there about how this stuff works. And because we were able to deal with just RTSP, that took me away from needing to build my own cameras. I can use anybody’s brand, anybody’s make, because it’s an open protocol despite the encryption. Once you have that, you’re good to go. You can process it. So that’s where it all came from. 
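For readers who want to see what consuming an open RTSP stream and comparing one frame against the previous one can look like, here is a minimal sketch. The camera URL is hypothetical, and the simple frame-differencing is only a stand-in for the learning David describes; the point is that any RTSP-capable camera works because RTSP is an open protocol.

```python
# Minimal sketch: pull frames from an RTSP stream and measure frame-to-frame change.
# The URL is hypothetical; the differencing stands in for the real learning pipeline.
import cv2

cap = cv2.VideoCapture("rtsp://camera.example.local/stream1")  # any RTSP camera
prev_gray = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if prev_gray is not None:
        diff = cv2.absdiff(gray, prev_gray)
        changed = (diff > 25).mean()   # fraction of pixels that changed noticeably
        if changed > 0.02:             # illustrative "something happened" threshold
            print(f"Frame-to-frame change: {changed:.1%}")
    prev_gray = gray

cap.release()
```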

Nick Earle:

You said you were able to pivot from the hardware layer into this software layer because there was an open standard, and this idea of prediction based on what came before and what will come next, which of course is actually the way ChatGPT works. It predicts the next word, which is why, when it goes on the screen, it’s quite jerky; it’s just predicting the next word. But to do that with pixels, in effect portions of the picture, is very cool. 

All right, let’s jump ahead. I wanted to deep dive on that because I think what you do that’s really different was forged in that era, and now there’s the application where it really does cross over into IoT and, in fact, what Eseye does, because we connect things, millions of things, and then send the data into the cloud, and many of our customers are using video. But what’s happening in the connectivity market, to give a bit of context, is that the intelligence is having to move to the edge. Particularly in a large-scale video deployment on IoT, and you know this extremely well, if you’re using cellular connectivity, for instance, you can’t backhaul the data to a data center a long way away, interpret the data, make a decision, and then send the decision back out again. You need to do as much of this intelligence as possible at the edge. 

Now, if you then take that video and, as you say, the smart city: it’s fine if you’ve got 10 cameras or 100 cameras, but when you start getting into millions of cameras, plus things which aren’t cameras but need to display video, like bus stops, street furniture as they call it, and whatever else, you’ve got this massive edge processing issue to deal with. So as I understand it, this is a big area for you now: the whole smart city management piece, the benefits for the people and the city planners, and translating that into a revenue benefit for them to make their city smarter. Maybe you could talk about that a little bit. 

David Ly:

Yeah, you’re so right. With the abundance of devices out there and the internet of things, there are so many different things that are now connected to the internet. You always have to consider bringing the processing to the edge. And it’s more prevalent today than when we talked about it 10 years ago, right? So Nick, Iveda is there. We have what we call smart gateways, and within a gateway you can put in a multiplexer to define what the fiber routing might be, where it routes, who it routes to, or you can define which MAC address should be processing whatever type of data. So the Iveda smart gateways allow us to manage these issues where bandwidth becomes a constraint, where the packet of data is a little too large to squeeze through the normal pipe. We do invest in that and we do implement such technology. 

But where it’s all going, I think I want to take you into real smart cities, right? People wonder: it’s one thing to collect the data and process it, but what are you doing with it? We are seeing momentum; let’s just say that momentum equals growth and growth equals future success, but let’s talk momentum. Cities now need to understand, as they invest in technology, what’s in it for me? That’s the question people tend to forget when we’re technology guys preaching technology; we get so excited we forget what it really means for the customer. But again, I’m obligated and forced to answer it in front of the customer: “David, you sound great, but what does it mean for me, dude?” I mean, I get mayors asking me that bluntly. 

So I found an answer that really works because it’s the truth. If we can help any institution, municipality, or organization generate more or new revenues, Nick, that’s the key to smart cities. A city has so much data potential; what do you do with that data? Can you get a bus somewhere quicker? And while somebody’s waiting at the bus stop and normal predictions say, “Hey, they’re heading to this destination,” why don’t you advertise that there’s a nice cold latte waiting for them at the destination, at such-and-such a store? I’m not going to mention a brand for promotion, but there are so many things one can do. Advertisers, the traditional advertising revenue concept. People might think that’s old hat for the internet, but for a municipality to charge corporations for advertising through citizen mobility, Nick, it’s so awesome what we can do. 

And it’s the smart gateways at the edge, and guess where we can place them? Existing infrastructure. Light poles that have sufficient power can be turned into smart poles, right? Another thing that Iveda is doing. So as I’m sharing with you and walking you down this path, you’ll see that you have to leverage some kind of existing infrastructure to start. You don’t magically go in and sell somebody a smart city, right? You’ve got to leverage what they have, give them some value, explain how you’re going to interconnect this infrastructure, and, as a result, they’ll receive more from it. That’s the conversation we, as technology companies, need to have, yes. 

Nick Earle:

I found it interesting the way you said that private companies have been doing it, but for cities and municipalities this is revolutionary, a breakthrough. So as you were talking, I was thinking about one of our bigger customers, which is actually Coca-Cola and their coffee line. They bought the coffee business of a British company called Whitbread, originally a brewer, and the coffee goes under the brand name of Costa. 

And the reason for saying that is that one of the things Costa machines have is a huge digital display. And when you hold your phone up to scan the QR code to get your loyalty points, what they’re actually getting is your identity. And the reason they want that is they want to know your personal coffee-buying habits, what time of day, what ingredients, what buttons you press, so they can do exactly what you said: it’s 10:30 in the morning and David hasn’t had his iced latte. Over here in the UK we wouldn’t drink an iced latte because it’s so cold, we’d want an extra hot one, but it’s the same principle. 

And so that data is incredibly valuable from a marketing point of view. The point is, a private corporation is doing it. By the way, the contrast is that if you go into Starbucks, they write your name on the side of the cup and then they shout, “Iced latte for David,” and you grab your cup and you leave the store, at which point they have no idea you were ever in the store, because they don’t get the personalized credit card information, they just get the payment, and your name and your identity walked out of the store in David’s hands. So it’s kind of a marketing play, the long-tail personalization of data. But the point is, it’s a private corporation doing it. 

Now, when I use the phrase street furniture with smart cities, of course you can only have so many coffee vending machines in a city. Even in London, I don’t know, maybe there are 5,000 Costa Coffee machines. But if you take all the infrastructure that a city owns, and I talked about bus stops and elevators and streetlights and traffic and everything, they’ve got so much infrastructure that their potential, if they could gather the data from it to create a revenue model, is enormous, because that data’s hugely powerful. We’re not just measuring your coffee choice, we’re measuring what happens in a city. 

And it seems to me what you’re doing is also predicting traffic movements or events in the city. You’re not just predicting the next pixel on the video like you were talking about; you could gather so much information that you can predict what’s going to happen next in traffic based on many, many more data points than, for instance, Waze, an app that does this, because Waze only has the data from the people who use Waze on their phone. So you actually could start enabling a level of prediction that’s not been seen before. 

David Ly:

That is correct. We are working with a predictive modeling tool at this point. And Nick, you just talked about the future, right? People walking their dog down the sidewalk, how many people are crossing the street at major intersections in cities like London, New York, Tokyo. I can’t talk about that here in Mesa, I think I’m the only guy crossing the street over to McDonald’s on a cool day, right? But you talk about human traffic every day, automobiles, mobility in general, how our public transit systems mingle with our human traffic and private vehicle mobility. Right now, if you can imagine, and I love it, you said city furniture, right? Fixed furniture and infrastructure like light poles, with Iveda’s smart video, with AI, with sensors tracking city assets like waste trucks, garbage trucks, and lots of things. When you put it all together, having different sensors and devices out there that provide relevant data for city management to take action on will mean so much. 

But I’ll tell you where we’re going with this. Yes, there are a lot of ways to predict traffic, car counting. In the past, people used to toss a rubber line across the road so that whenever a set of tires hit it, you knew it was a Ford vehicle or an 18-wheeler, right? We still do that today, Nick. We still do that. So imagine having AI video counting vehicles, not just counting cars but counting how many motorcycles, how many sedans, how many trucks, pickup trucks, semi-trucks, et cetera. That counting is one thing. The timing of how it moves throughout the day on that street and every corridor throughout the city is another. We are using predictive modeling now in our internal testing laboratory. I can tell you that a blue Honda Accord is most likely to make the right turn on Christmas Eve, 2024. That’s the level we’re getting down to. 

But we’re not there yet. Why? The data sets still require too much data over too long a period of time. So as we continue to perfect this, Iveda is working on predictive modeling where we require minimal data. That’s where the intelligence lies right now. And I’m sure as you talk to more folks, I hope they would agree, it’s not that people can’t do this right now, it’s getting it down to the efficiency for that immediate gratification, Nick. People want to know things now, and when you tell them they’ve got to wait three months, or spend six months to collect certain data, it kind of bores them now, especially in 2023. Ten years ago, we might have accepted that. But we’ve got some skunkworks projects in play right now that are going to be really exciting for cities to come. 
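As a rough illustration of the “AI replaces the rubber counting line” idea, here is a minimal sketch of tallying vehicles by class and by hour of day. The detect_vehicles() function is a hypothetical placeholder for whatever detection model a real deployment would use; only the counting structure is the point.

```python
# Minimal sketch: per-class, per-hour vehicle counts from video detections.
# detect_vehicles() is a hypothetical placeholder, not a real model.
from collections import Counter, defaultdict
from datetime import datetime

def detect_vehicles(frame):
    """Placeholder detector: returns a list of (class_label, confidence)."""
    return [("sedan", 0.97), ("pickup_truck", 0.91), ("motorcycle", 0.88)]

hourly_counts = defaultdict(Counter)  # hour of day -> Counter of vehicle classes

def process_frame(frame, timestamp=None):
    timestamp = timestamp or datetime.now()
    for label, confidence in detect_vehicles(frame):
        if confidence >= 0.85:        # illustrative confidence gate
            hourly_counts[timestamp.hour][label] += 1

process_frame(frame=None)             # demo call using the placeholder detector
for hour, counts in sorted(hourly_counts.items()):
    print(f"{hour:02d}:00", dict(counts))
```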

Nick Earle:

Well, I won’t ask you to reveal any more, but anyone listening to this can see the potential. It takes IoT away from just the product domain, as I said in the case of coffee machines or whatever, into a sort of mesh-type network, the smart city promise, where a whole set of services is delivered to citizens, with a massive boost in the revenue potential for cities, but also for companies who will want to buy that data, because that data will become one of the biggest revenue sources for cities. That means they will have to think differently, beyond just collecting taxes or whatever it is that they do. I’m sure they do more than that, delivering services. 

But the point is that if they could get this data, from Iveda but also from a lot of other companies like ourselves and our customers, if you can get that data all into one place, then it could be that a city’s real-time data on what’s happening becomes, in the future, one of the biggest monetization opportunities for a city. It used to be that if you were on the coast, you had access to a port, or you had a railway line coming from somewhere, those sorts of connections, a good communication infrastructure, made all the difference to whether your city flourished or not. The London Underground expanding means London expands. So transport became the driver of city growth. 

And what you’re talking about is, well, yes, transport will always be the driver, but efficient transport is even more important, and leveraging the data that’s inherently there but not being collected is actually going to be one of the biggest differentiators in the residents’ experience and the businesses’ experience within the city. So it’s a very exciting area. I know we could talk for a lot longer about this, but I think we’ve probably got to the end of our slot. 

David, it’s really exciting. I mean, if we go back to the beginning, you’ve had a hell of a journey, an amazing story, and as you said, you’re now on NASDAQ, you’re doing a lot of exciting stuff, and there are things cooking in the labs that you’ve hinted at. But I think we can all relate to what you’re talking about in our experience of cities, which isn’t always great; the traffic experience certainly isn’t always great in these big cities, and a lot of niche apps don’t really make our lives easier in general. So I think what you’re doing is tremendous, and it’s a great example of IoT and technology and cloud and video and AI coming together to change people’s lives, which is where it really resonates, as opposed to it just being a bunch of technologies that’s cool for the engineers. So it’s a great story. 

David Ly:

Absolutely, and thank you so much for summarizing it like that, Nick. And yes, over the past 20-year span of our operation, you are right, again, we’re just obligated to find better ways of doing things, right? We’ve had to face so much from so many different types of customers, and it would get very tiring if we were doing the same thing over and over, trying to sell product. The goal is to build tools for organizations and cities to manage and own their data. It’s one thing to collect it, but what do you do with it? These tools make sense of the data that they’re collecting. 

And at the end of the day, we as citizens, coming back to we, the people, want safety, security and convenience. If you provide me those three as a citizen, I continue paying my taxes. If you improve my experience on all those three, I’m happy paying my taxes, right? At least I won’t complain. So that’s really the message, and Iveda’s goal is to really build the right tools to make sense of all the data that we’re all going to get. Either way, whether we like it or not, we are living with packets floating all around us every day. I know I speak in terms of packets, but it’s true, they’re floating all around every day, so we’ve got to make sense of it. 

Nick Earle:

Harness it to the benefit of everyone. Well, it’s wonderful, it’s a very exciting area. There are many years of innovation ahead, and it’s something, as I say, we can all relate to as individuals, as citizens, not just on the technical side of things. 

So David, let’s leave it there. Thank you again for being my guest on IoT Leaders. It was a really, really interesting podcast, and I wish you very well with Iveda, I’m sure we’re going to be hearing a lot more about you, maybe over here in Europe as well. I know you’re primarily in the US, but if you keep doing what you’re doing, I think we could be seeing you in some other parts of the world. We certainly need you in London where they’re just taxing the hell out of drivers right now trying to wage war against the car, which is becoming a big political issue. So some of the things you’ve talked about might be a more intelligent solution, but I think we’re going to have to wait a little while. But in the meantime, thank you so much for sharing not only your company story and the technology story, but your personal journey, which has been amazing. So thanks very much for being my guest on IoT Leaders Podcast. 

David Ly:

Thank you again for having me. And yes, I’d be delighted to join you in London one of these days. Yeah, look forward to it. 
