Anyone for sharing garbage?

By Steve Whalley, Chief Strategy Officer, MEMS Industry Group

At Sensors Expo, June 9-11, it was encouraging to see so many first-timers at the pre-conference MEMS track hosted by MEMS Industry Group, as well as at the two-day main event. Perhaps it was the Long Beach, CA location for this year’s event, or just the very healthy double-digit growth rates in MEMS and sensors, that brought upwards of 75% new attendees to see what the excitement was all about. Either way, it was encouraging to feel the buzz in the sessions and exhibit hall for all three days.

While the event superbly covered a multitude of topics across the MEMS and sensor supply chain, applications, and adjacent technologies and ecosystems, I want to highlight just one important topic that a panel, expertly hosted by Mike Feibus, addressed in the MEMS pre-conference track: the technical side of Big Data and its use. It’s clear that as billions of sensors at the edge now constantly feed the upstream data pipeline, there is much more to come. With the era of a trillion sensors not that far away, we could soon be dealing with brontobytes of data (a brontobyte is 1,024 yottabytes, or about 1.24 × 10^27 bytes).

Some notable quotes from the Big Data panel reveal challenges and opportunities our industry will need to address. To highlight just three:

1. “A data tsunami carries a lot of garbage with it.” – Ian Chen, Freescale

Can we keep sending this deluge of data, and its associated garbage, up to the cloud? According to some, the garbage could be as much as 99% of the content generated. While it’s useful to do all the aggregation and processing of data in one place, it’s unlikely we can always afford the computation, storage and, more importantly, the bandwidth and round-trip latency required to keep pushing all this data to the cloud. Clearly we need to get smarter about where the processing and aggregation take place. The cloud, the fog, the gateway, the application processor, the sensor hub and the sensor itself are all options along the way in a typical IoT environment. The question becomes: do you know when, where and how to process the data and remove the garbage from further upstream travel, under constraints such as cost, power, hardware and software footprint, security, latency and processing power, to name a few? Will we need new system architectures to handle these complex questions in the rapidly approaching era? Or do we need more expert data scientists programming in advanced frameworks like Hadoop to manage the digital data exhaust?
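One way to drop the garbage before it travels upstream is report-by-exception filtering at the sensor node itself: forward a reading only when it has changed meaningfully since the last one sent. Here is a minimal Python sketch; the signal, threshold and names are invented for illustration, not taken from any particular product:

```python
def deadband_filter(samples, threshold):
    """Yield only samples that moved more than `threshold` since the last
    forwarded sample; everything in between is dropped at the edge."""
    last = None
    for s in samples:
        if last is None or abs(s - last) > threshold:
            last = s
            yield s

# A slowly drifting temperature signal: almost every sample is redundant.
readings = [20.0 + 0.001 * i for i in range(10_000)]
forwarded = list(deadband_filter(readings, threshold=0.5))
print(f"forwarded {len(forwarded)} of {len(readings)} samples")
```

Even this trivial filter forwards well under 1% of the samples, which is roughly the garbage ratio the panel quoted; real deployments would add time-based heartbeats and event detection on top.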

2. “Sharing data across the ecosystem will help accelerate best practices and solutions.” – Eduardo Pinheiro, Muzzley

The panel discussion around this comment centered on how we foster growth and accelerate innovation in the industry by sharing data-sets, perhaps in an open-source type of environment. As we currently collect data, both valuable and garbage, besides the issues noted in item 1 above, we also have questions around who owns the data and who owns its security and privacy. And how do we respect the privacy of the people the data relates to while still delivering a simple user experience? While many are willing to share their data freely today, others are not so forthcoming, and this needs to be respected. These questions make it challenging to liberally share data from user to user, to company, to government institution, for example. Assuming there are applications and systems where the answers to these questions are crystal clear, what opportunities would exist if the owner were willing to share that data-set with the clear owners of other synergistic data-sets? Do we need explicit permission from users to do so? Could we avoid redundancy in collecting similar data-sets? Could we improve device performance, power and accuracy by sharing certain data-sets from a world-wide perspective versus just one region within a country? Could we ultimately reduce time to market and cost significantly? Could we also increase the accuracy of evaluating security and terrorism threats by using Big Data? And at what cost to ‘individual’ freedom, whatever that may mean to you specifically?

3. “A venue is needed where we agree on how to tag data and share it.” – Ian Chen, Freescale

This comment is related to item 2 above: if the data-sets are all characterized by a standardized tagging scheme, the sharing process should be technically much easier. The question becomes: is there an existing form of data tagging that could be readily adopted today, and if not, what is the right forum to drive this topic to an industry-agreed solution? Perhaps we should also ask whether the industry agrees that this is a relevant topic to address at all. Maybe, as a smaller but important step, we could agree on a basic protocol to help make Big Data a little more manageable at the edge and sensor node.
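As a strawman for that discussion, a tagging scheme need not be elaborate; even a small metadata record attached to each data-set would make sharing technically tractable. The field names below are purely hypothetical, not an existing industry standard:

```python
import json

# Hypothetical tag for a shared accelerometer data-set; every field name
# here is only an illustration of what an agreed scheme might cover.
tag = {
    "sensor_type": "accelerometer",
    "units": "m/s^2",
    "sample_rate_hz": 100,
    "start_utc": "2015-06-09T10:00:00Z",
    "region": "US-CA",
    "owner": "example-org",
    "privacy": "anonymized",       # who may see the data, and in what form
    "license": "research-only",    # terms under which it may be shared
}
print(json.dumps(tag, indent=2))
```

Agreeing on even this handful of fields would let a recipient interpret, filter and de-duplicate shared data-sets without a bilateral conversation for every exchange.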

Next Steps – over to you

I have purposely posed the questions arising from these panel comments rather than expressing my opinions on how to address them. The MEMS Industry Group and the Accelerated Innovation Community we currently use for sharing open-source sensor algorithms could be one venue to begin addressing them, but it’s not the only one. I would like to hear from you on the relevance of these topics and your suggestions on how we should begin to tackle them. Your feedback would be much appreciated; please share it in the MEMS Industry Group LinkedIn community: http://memsindustrygroup.org/linkedin

Thank you.

The Age of Voice is Here

By Matt Crowley, CEO, Vesper
According to MIG Member Matt Crowley, CEO of Vesper, the voice user interface (VUI) is poised to replace touch as the primary user interface in smartphones, smartwatches and other consumer products – as well as in automotive infotainment systems and smart home IoT devices.

Technology market leaders such as Apple, Google, Amazon, Microsoft and Samsung are driving this migration to VUI and have made great improvements in voice interface data processing, algorithms and software. But hardware innovations have not kept pace, and a major advance in MEMS microphone performance is needed to make ubiquitous VUI a reality.

The MEMS microphone industry is the final piece of the puzzle, says Crowley. Read his Design News blog, UX Design: The Age of Voice Is Here, for the whole story.

What is Sensor Data Analytics?

Guest Blog by Freescale Semiconductor

Data analytics (DA) is the science of examining raw data with the purpose of uncovering insights and drawing conclusions. Businesses have applied data analytics to a consumer’s purchasing history, website browsing patterns, reading list and movie preferences to better understand consumers’ needs and sharpen companies’ targeting efforts. As the industry continues to advance, analytics can and should be applied to raw sensor data as well, a practice commonly referred to as sensor data analytics.

It is already common to use sensors for navigation, to track user motion for health, and to create intuitive user interfaces. However, sensor data offers more than just motion monitoring; algorithms can analyze subtle variations in the data to uncover different layers of information. For example, a turning motor presents a vibration signature that can be recorded by an accelerometer. The vibration concentrates around a small number of frequencies determined by the gear ratio, and the energy pattern in the frequency domain remains relatively constant while the motor is healthy. As the motor wears out, a defective ball bearing or a slip in the gears will cause it to vibrate more vigorously, with a broader vibration frequency spectrum. By detecting these changes in frequency content, accelerometer data can be analyzed over time to derive motor health. Armed with this information, factory managers could schedule preventive maintenance more efficiently.
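That frequency-domain check can be sketched in a few lines of Python: compute the vibration power spectrum and measure how widely the energy spreads around its centroid frequency. The signals, sample rate and noise level here are synthetic, for illustration only; a production system would use an FFT library and calibrated per-motor baselines:

```python
import cmath
import math
import random

def power_spectrum(signal):
    """Naive DFT power spectrum. Fine for the short windows a sensor
    node would analyze; production code would use an FFT library."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2
            for k in range(n // 2 + 1)]

def spectral_spread(signal, fs):
    """Standard deviation of the power spectrum around its centroid
    frequency. A healthy motor's narrow signature gives a small spread;
    broadband wear vibration gives a large one."""
    power = power_spectrum(signal)
    total = sum(power)
    freqs = [k * fs / len(signal) for k in range(len(power))]
    centroid = sum(f * p for f, p in zip(freqs, power)) / total
    return math.sqrt(sum((f - centroid) ** 2 * p
                         for f, p in zip(freqs, power)) / total)

fs = 1000.0                                  # accelerometer sample rate, Hz
t = [i / fs for i in range(250)]             # a quarter-second window
rng = random.Random(0)                       # fixed seed for repeatability

healthy = [math.sin(2 * math.pi * 60.0 * ti) for ti in t]   # one clean 60 Hz tone
worn = [h + 0.8 * rng.gauss(0.0, 1.0) for h in healthy]     # plus broadband noise

print(f"healthy spread: {spectral_spread(healthy, fs):.1f} Hz")
print(f"worn spread:    {spectral_spread(worn, fs):.1f} Hz")
```

On this toy data the clean tone yields a spread near zero while the noisy “worn” signal spreads far wider, so a simple threshold on the spread, trended over time, would flag the wear.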

Applications for sensor data analytics are not limited to preventive maintenance and prognostics. Freescale engineers are working with key customers to use sensor data analytics to uncover and monetize information embedded in sensor data and create new classes of intelligent appliances. Here are just a few interesting sensor data analytics applications:

  • Smart bed: using sensor data to monitor pulse rate and respiration, which can be used to improve sleep quality for a more productive morning.
  • Beer keg: using sensor data to determine how much beer is left in the keg and derive real-time beer consumption information, allowing bars and restaurants to order beer only when it’s needed and exactly when it’s needed, reducing spoilage.
  • Dosimeter: using sensor data to monitor how patients are administering the prescribed drugs for their chronic illness, ensuring a patient is adhering to the medication regime.
  • Safe power tools: using sensor data to recognize potential hazards while in use to avoid accidents.
  • Intelligent stove tops: using sensor data to detect that the contents of a pot are boiling and turn down the heat without user intervention, reducing food waste and therefore costs for a commercial kitchen.
  • Fail-safe sump pumps: using sensor data to detect a failing sump pump and summon help automatically before the issue becomes serious.
  • Shower fixtures: wirelessly transmit hotel guest water usage data to a central accounting system – designed to “modify guest behavior.”
  • Tire pressure monitor system: provide preventive maintenance prognostics to improve vehicle safety and efficiency.
  • Swimming pool automation: sensors to monitor and control automatic sanitization, pH balancing, cleaning and filtration.

As the secure Internet of Things continues to grow, sensor data analytics will be a key component in driving new applications and helping companies monetize those applications by providing new benefits to consumers and companies. Our next postings will explore in more depth how sensor data analytics can be applied to products used in our daily lives.

Google Glass Needed Better Vision

By Karen Lightman, Executive Director, MEMS Industry Group

Alas, poor Google Glass, I never really got to know you. I wanted to, really. I was there when Google unveiled you at its Google I/O lovefest, with skydivers parachuting onto the roof of Moscone Convention Center carrying a live video feed of the extravaganza. That was so cool! I learned from insiders and friends at Google that you were adorned with many MEMS (micro-electromechanical systems) and sensors – so of course, I loved that! I was so close to having you…and yet…

I could never rationalize the $2,000 starter price to walk around with another set of glasses in addition to the specs that I need to actually see. I could never accept the stigma attached to them — that I was another “Glasshole” who could track, record and digitize every aspect of my life — personal and professional. I was never comfortable with that level of intimacy with a wearable that was so un-wearable.

I guess I was not alone in my decision not to don a pair of Google Glasses. As Google recently announced, the project was shelved and handed over for a reboot to the head of Nest, the smart-thermostat company Google purchased last year for $3.2 billion.

So what explains the “epic fail” of Google Glass? Why the decision to drop the project within a week of Microsoft’s announcement of its augmented reality (AR) wearable glasses, HoloLens? Will Microsoft’s HoloLens also fail? And what makes a successful wearable?

I know these are loaded questions and there are no definite answers. Let me begin by answering the last question in hopes that it will help shed light on the other two.

For a wearable to be successful, the user should not be aware that he/she is wearing it, according to my friend and human-computer interaction (HCI) expert, Mark DiPerri. The wearable should be seamless, powered efficiently and effectively and should enhance the user’s quality of life. Google Glass did all these things poorly, and that is why it failed.

That said, I have seen a successful use of smart glasses by Vuzix, which, in partnership with enterprise software firm SAP, is marketing an augmented reality offering designed for industrial applications, in particular distribution centers. A video demonstration shows some exciting possibilities for a forklift operator.

Clearly, this is a good use of smart glasses and I wonder why Google didn’t directly address such a market. Maybe it was not sexy enough for them — but alas, sex appeal can only last so long. Eventually the honeymoon is over and you’re left with a pair of smart glasses that you don’t want to wear in public.

I wonder if designers, architects, and warehouse workers will ever use Microsoft’s HoloLens or will its appeal be limited to gamers glued to their Xboxes? I have heard the reviews of these babies are pretty solid, though I have yet to see a pair worn in public.

At the 2015 International CES, I saw some pretty awesome VR glasses from Virtuix that gave users a realistic feeling of being in the battlefield of a video game. But given that I am not a video gamer, these glasses still aren’t for me.

I can’t help but wonder what else a system like the Virtuix Omni can enable. Can’t we use these glasses for real-time on-the-job training for police, firefighters, paramedics and other rescue personnel? What about remote surgeries? What about teachers and students in an interactive classroom? Heck, maybe I can have a pair of smart glasses that can help me cook a decent meal without burning part of it. That would be a great quality of life improvement for my family!

Yes, we can push the envelope of where and how we use smart glasses. I am sure we will. Thanks to the MEMS and sensors inside them, along with sensor fusion algorithms and smart data analytics, we can and will create smart glasses that I’ll want and treasure. I am excited to see where the next, post-Google Glass generation of smart glasses will take us.

Originally posted on eetimes.

WiSpry and SkyeTek Release World’s First RF MEMS-Tuned UHF RFID Antenna

IRVINE, Calif.–(BUSINESS WIRE)–WiSpry, Inc., the leader in high-performance tunable radio frequency microelectromechanical systems (RF MEMS) semiconductor products for the wireless industry, today announced that SkyeTek, Inc. has leveraged the company’s adaptive antenna tuning technology to develop the first RF MEMS-enabled radio frequency identification (RFID) reader. The compact SkyeModule™ Nova features WiSpry’s RF MEMS products, automatically correcting impedance mismatches between the reader and the antenna.

“This type of innovative thinking and partnering is at the root of our company. We are eager to see where SkyeTek – and other customers – apply our tunable RF technology next.”

Competing RFID readers have difficulty achieving maximum range with real-world antennas: they either consume additional RF energy or simply won’t work at full range. With the tunability and low loss of WiSpry RF MEMS, the SkyeModule Nova RFID reader optimizes the match between the antenna and reader without sacrificing power or range. This allows the system to run more efficiently, generating less heat and saving energy even at long read ranges. The combination of efficiency and long read range will help RFID expand into applications previously thought impractical. SkyeTek was able to design a revolutionary product for the RFID market by taking advantage of WiSpry’s low-power, exceptionally high-Q tunable RF MEMS-based products.
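The payoff of correcting an impedance mismatch can be estimated with the textbook mismatch-loss formula, loss = -10·log10(1 - |Γ|²), where Γ = (Z_load - Z0)/(Z_load + Z0). The impedance values in this Python sketch are illustrative, not SkyeTek measurements:

```python
import math

def mismatch_loss_db(z_load, z_source=50.0):
    """Power lost to reflection, in dB, when an antenna of impedance
    z_load (ohms) is driven by a z_source-ohm reader front end."""
    gamma = abs((z_load - z_source) / (z_load + z_source))
    return -10.0 * math.log10(1.0 - gamma ** 2)

# A detuned "real world" antenna (e.g. near metal or a hand) vs. the
# same antenna pulled back close to 50 ohms by an adaptive tuner.
print(f"detuned (150 ohm): {mismatch_loss_db(150.0):.2f} dB lost")
print(f"retuned  (55 ohm): {mismatch_loss_db(55.0):.2f} dB lost")
```

Recovering even a decibel of mismatch loss matters in passive UHF RFID, since read range depends directly on the power actually delivered to the antenna.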

“The WiSpry MEMS Digitally Tunable Capacitor Array has allowed us to provide adaptive antenna tuning without significant reduction in module output power,” said Mark Matlin, Sr. RF Engineer and antenna tuning project manager. Brad Alcorn, Director of Product Development, added, “The WiSpry part has allowed us to tune over a significant range of impedances with a single part. This would not be possible with any other device we are aware of.”

“SkyeTek has stepped outside the box to advance the field of UHF RFID, and we are very pleased that WiSpry technology enabled this significant advancement,” said Jeff Hilbert, CEO, president and founder of WiSpry, Inc. “This type of innovative thinking and partnering is at the root of our company. We are eager to see where SkyeTek – and other customers – apply our tunable RF technology next.”

About SkyeTek, Inc.

SkyeTek delivers RFID solutions for applications such as Inventory and Asset Management, Access Control, Product Authentication, and Patron Management. Oil and Gas, Healthcare, Retail and Hospitality industries all across the globe utilize SkyeTek solutions to improve efficiency and solve business problems. Contact Sales today to see how we can make RFID work for your application. SkyeTek is a privately held company headquartered in Denver, Colorado, with sales operations in North America, Europe and Asia. For more information, visit http://www.skyetek.com.

SkyeModule™ Nova is a trademark of SkyeTek, Inc.

About WiSpry

Headquartered in Irvine, Calif., WiSpry, Inc. provides tunable RF products that revolutionize wireless technology. The fabless RF semiconductor company utilizes microelectromechanical systems (MEMS) technology to design reconfigurable products, enabling wireless product and infrastructure manufacturers to support higher performance, next generation multi-band and multi-standard device and network architectures. Leveraging standard RF-CMOS process flows, WiSpry integrated components and modules deliver optimal flexibility and tunability without sacrificing performance. WiSpry is backed by a global group of investors specializing in innovative technology for growth markets. For more information, visit www.wispry.com.

WiSpry® and all related marks are property of WiSpry, Inc.

This article originally appeared on Business Wire.

CES 2015: Data Analytics Lend Wireless Sensors Power to Change Lives

By David Allan, President, Virtuix Inc.

Walking the aisles at CES, you are hard-pressed to find a single product that doesn’t contain at least one sensor. The latest iPhones add a barometric sensor to at least a dozen others. By some predictions, a trillion-sensor world is not far off. Yet what benefits, really, will this ubiquity of sensors deliver? We put this question, and others, to the speakers at the Sensors and MEMS Technology conference.

To Karen Lightman, Executive Director of the MEMS Industry Group, the answer lies in pairing sensors with data analytics. She notes that “MEMS and sensors are already fulfilling the promise to make the world a better place, from airbags and active rollover protection in our cars to the smart toaster that ensures my daughter’s morning bagel won’t be burnt. By combining sensors with data analytics, we can increase that intelligence exponentially.”

An example is biometric measurements, which traditionally suffer from undersampling. Your doctor checks your pulse or blood pressure just once in a while, whereas a typical day may see wild fluctuations. David He, Chief Scientist at Quanttus Inc., predicts a convergence between consumer and clinical use of wearable sensors. Noting that cardiovascular disease and other chronic conditions often go undiagnosed, he foresees ICU-quality wearable sensors that measure your vital signs as you undergo daily activities, relying on enormous datasets to detect problematic patterns. “While everyone is looking for the killer app in wearables,” he urges, “we should be looking for the un-killer app.”
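The undersampling problem is easy to illustrate with a toy signal: a vital sign that spikes briefly once a day. A single office-visit measurement almost always misses the spike, while continuous wearable sampling catches it. The signal model below is invented for illustration, not clinical data:

```python
import math

def vital_sign(minute):
    """Toy vital-sign model: a slow daily rhythm plus a brief
    15-minute stress spike at 10:00 am. Purely illustrative."""
    baseline = 80.0 + 5.0 * math.sin(2 * math.pi * minute / 1440)
    spike = 40.0 if 600 <= minute < 615 else 0.0
    return baseline + spike

office_reading = vital_sign(540)                    # one 9:00 am measurement
continuous = [vital_sign(m) for m in range(1440)]   # once a minute, all day

print(f"office visit saw: {office_reading:.1f}")
print(f"wearable peak:    {max(continuous):.1f}")
```

The single reading looks unremarkable while the continuous trace captures the transient peak, which is exactly the pattern-detection opportunity large wearable datasets open up.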

Data analytics paired with ubiquitous sensors promise to improve and even save lives (Image courtesy of Quanttus Inc.)

Ben Waber, CEO of Sociometric Solutions, puts sensor data to a radically different use. His firm outfits employees of large companies with sensor-equipped badges that track their interactions. “In any industry the interaction between employees is the most important thing that happens at work,” he told CNN. His badges use motion sensors to follow users as they mix with others in the office and to monitor their posture while seated (slouching suggests low energy). A microphone measures tone of voice, rapidity of speech, and whether a person dominates meetings or allows others to speak in turn.

Waber claims employees can use the results to improve performance and job satisfaction. “You can see the top performers and change your behavior accordingly, to be happier and more productive. In a retail store, you might see that you spend 20% of your time talking to customers, but the guy who makes the most commission spends 30%.” He adds, “I can point to thousands of people who say they like their jobs better.”

Steven LeBoeuf, president of Valencell, points to a problem he calls “death by discharge,” meaning the tendency of novel wearables to “land in the sock drawer before insights can be made” because users tire of keeping them charged. His firm promotes a category he calls “hearables”: sensors added to earphones—powered from a standard jack—that measure pulse, breathing, blood pressure, and even blood-oxygen saturation, all from gossamer-thin vessels on the ear called “arterioles.” Yet measurements alone, he cautions, fall short without comparative analytics. “Human subject testing is a different animal altogether…extensive human subject validation is required for accurate biometric sensing.”

Sensor data is moving from the physical realm to the mental. Rana el Kaliouby’s company, Affectiva, combines sensor data with analytics to monitor emotional states, detecting stress, loneliness, depression, and productivity. She foresees a sensor-driven “emotion economy” where devices act on our feelings. She told The New Yorker, “We put together a patent application for a system that could dynamically price advertising depending on how people responded to it.”

Indeed, patent filings abound for mood-sensing devices. Anheuser-Busch’s application for an “intelligent beverage container” notes that without it, sports fans at games “wishing to use their beverage containers to express emotion are limited to, for example, raising a bottle to express solidarity with a team.”

Now stonily indifferent to our feelings, our devices may acquire an almost-human sympathy. “I think that, ten years down the line,” predicts Affectiva’s Kaliouby, “we won’t remember what it was like when we couldn’t just frown at our device, and our device would say, ‘Oh, you didn’t like that, did you?’”

MEMS the Word – January Newsletter

By Karen Lightman, Executive Director, MEMS Industry Group

Several years ago I coined the phrase “MEMS frickin’ everywhere.” I shared my vision for MEMS enabling a smarter and better world. This was before the term Internet of Things (IoT) had taken hold. My catchphrase got me into a bit of trouble with those offended by my use of a modified expletive as well as skeptics of the potential of MEMS.

Today that vision of MEMS everywhere seems passé and so obvious. That’s because the outlook for MEMS and sensors has never looked brighter – this was incredibly apparent to me at the 2015 International CES.

At this year’s CES, in addition to hosting the Sensors Marketplace on the show floor and a booth with several of our member companies, MEMS Industry Group (MIG) hosted its third annual conference at CES. In 2013 we were invited by CEA to host a 1.5-hour conference; in 2014 it doubled to three hours; and this year we filled an entire day of content, plus a cocktail party. Some might say that MIG is growing as fast as the MEMS and sensors industry it represents!

Wearable Devices and the Search for the Holy Grail at 2015 International CES®

By Karen Lightman, Executive Director, MEMS Industry Group


2015 has already been heralded as the year of the wearable device and MIG chose wearables and the MEMS/sensors supply chain as the theme for our conference. We packed an impressive lineup of featured speakers and panelists. There have been several stories already posted by the press on the conference track as well as our exhibit so I won’t retell the already told. Instead I’d like to share with you my favorite quotes, moments and impressions from the entire show.

What’s my number one? Something I’ve known for a while but now really believe is the HOLY GRAIL to the future success of both wearables and IoT/Everything: POWER. That means power reduction and management through sensor fusion, power generation through energy harvesting, and basic battery longevity. It became very clear from conversations at the MIG conference, as well as with folks on the show floor, that the issue of power is the biggest challenge and opportunity facing us now.

MIG’s recently announced Accelerated Innovation Community (AIC), an open source algorithm library for sensor fusion, is a good first step. AIC can help address the issue of sensor fusion to enable more powerful and power-efficient wearables and IoT/E. It has become clear that as an industry we’ve got to do more to address the issue of sensor fusion as well as power reduction, management and creation.  In order to be successful we need more folks onboard to participate in AIC as well as spread the word to end-users and integrators. Won’t you join our merry band of sensor fusion evangelists?

Favorite quote? It comes from David He, Chief Scientific Officer, Quanttus, when he described his company’s goal to find the “unkiller app” by enabling clinically accurate, contextual and continuous data that can empower people to truly take control of their health and, yes, save lives. At our conference, David unveiled Quanttus’ never-before-seen health analytics mapping the blood pressure of 200 people, which gave the audience a glimpse of the future described by Dr. Eric Topol in his book The Creative Destruction of Medicine. As someone who has been at the mercy of out-of-touch doctors who controlled my cancer treatment/healthcare, I welcome the day when I have a wearable device enabled by MEMS and sensors along with data analytics that gives me smart, useful and actionable data to help me guide and manage my own healthcare, thank you very much.

Lastly, being at CES this year reiterated my love and affection for MIG members. From the members who have been with MIG since its foundation in 2001 like Intel to our newest member, Virtuix (whose President joined MIG only minutes after speaking at MIG’s CES conference), MIG members totally rock. It was a pleasure and a delight to be in their company for one week, even at the world’s most insane tradeshow (because it’s in Vegas, after all).

MIG is a growing industry association in a growing industry. I’m confident that together, we can create a world that has MEMS and sensors frickin’ everywhere, but only if we continue to address the remaining challenges to commercialization. Won’t you join us?