#034 PATENT DROP - Ford, Nvidia & more
Self-landing military planes, cars that don't stink, and live inside photos
This Patent Drop is going out to 9,188 people! Hit subscribe to get a peek into the future with 3 summaries of new patent applications from big tech companies every week ✨🔮
Hi, happy Monday!
This week’s Patent Drop focuses on some companies I don’t usually write about, in spaces I know less about. Let me know what you think! And if you have any friends or colleagues that are interested in the future, please forward them this email.
Before I get into this week’s filings, I also wanted to tell you about a new product called Inspo - launched by a Patent Drop reader.
Inspo is a search engine for inspiration - it's like "Google for creativity" and can help you get better ideas, faster around any topic, trend or marketing event.
For example, when I searched for “Patents”, its AI gave me these thought-starters that could inspire some creative campaigns for Patent Drop, such as:
1. Patents protect the body of an idea, but not the soul.
2. Patents are about the emotion of creating something new, and not having it taken away.
3. A patent is to patience what base camp is to the final summit.
Inspo is being used by companies such as Dentsu for content creation, strategy, campaigns, ads, moment marketing and other creative projects.
If you want to give it a try, all Patent Drop readers are getting 1 month free to Inspo Pro - just use the promo code PATENT:
PS: if you work in marketing / strategy / VC and want an intro to the team, just hit reply
Okay cool, let’s get into this week’s filings!
This is the first time a defense company has featured on Patent Drop.
This patent application looks at how Lockheed Martin’s autonomous aircraft can evaluate different potential landing zones.
To decide where to land, Lockheed Martin’s aircraft will integrate real-time cellular data to detect the presence of people in a potential landing area. More specifically, the aircraft will calculate cell phone density - the number of cell phones in a particular area.
The real-time cell phone data will be used in conjunction with sensor data, such as LADAR (laser detection and ranging) sensors and LIDAR (light detection and ranging) sensors, as well as existing world maps that contain data around terrain and any obstacles (man-made or natural).
The aircraft will be configured with acceptability criteria for determining which landing zones are acceptable and which aren’t. For instance, there would be a threshold value for cell phone density below which a landing space is deemed safe.
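The filing doesn’t spell out the exact criteria, but the threshold logic it describes can be sketched like this. Everything here - field names, threshold values, the ranking step - is an illustrative assumption, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class LandingZone:
    cell_phone_density: float    # phones per sq km, from real-time cellular data
    max_obstacle_height_m: float # from LADAR/LIDAR sweeps
    terrain_slope_deg: float     # from existing world maps

def is_acceptable(zone: LandingZone,
                  max_density: float = 5.0,
                  max_obstacle_m: float = 0.5,
                  max_slope_deg: float = 7.0) -> bool:
    """A zone passes only if every criterion is within its threshold."""
    return (zone.cell_phone_density <= max_density
            and zone.max_obstacle_height_m <= max_obstacle_m
            and zone.terrain_slope_deg <= max_slope_deg)

# Rank candidate zones: acceptable ones first, then by fewest phones nearby.
candidates = [LandingZone(0.0, 0.2, 3.0), LandingZone(12.0, 0.1, 2.0)]
ranked = sorted(candidates,
                key=lambda z: (not is_acceptable(z), z.cell_phone_density))
```

The point is just that each data source (cellular, LADAR/LIDAR, world maps) contributes one criterion, and a zone must clear all of them at once.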
Why is this interesting? I always find it fascinating how awash the world is with different data sources, and what complex activities processing that data can enable - in this example, cellular data helping military aircraft land autonomously.
Beautiful and scary.
Funny and curious.
According to Ford, a key differentiator in the future mobility market is odour control - especially in a world of shared vehicles (e.g. robo-taxis).
Odour mitigation is pretty complex when you think about it. Firstly, odour is mostly a psychological response and is difficult to map to the chemistry that produces it. Secondly, responses to odour are complex: they can vary with cultural, demographic, geographic or other backgrounds. Lastly, the response to odours is time-dependent - an odour that increases gradually may elicit no response at all.
In this filing, Ford mentions a number of potential odour management systems.
One relies on an e-nose (electrochemical nose) that contains a number of sensors to identify the presence of potential smells in the air. To mitigate any odours, the vehicle will rely on devices that either add, move or remove air from a vehicle interior. For example, window controllers, HVAC systems, filtration systems or air freshener systems.
Another system relies on human noses. In this example, the car might ask a human to describe a new smell detected in the air and whether they like it or not. The chemical fingerprint of the smell would then be stored, alongside the description of it and whether it’s considered good or bad. The system would then figure out the best odour mitigation strategy for that smell, and store that learning for the next time it encounters that smell.
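That human-in-the-loop memory can be sketched as a simple lookup keyed on the smell’s chemical fingerprint. The function names, the fingerprint format and the "ventilate" strategy are all hypothetical, purely to illustrate the store-and-reuse loop the filing describes:

```python
# fingerprint -> {description, pleasant, strategy}
ODOUR_MEMORY: dict = {}

def record_feedback(fingerprint: tuple, description: str, pleasant: bool) -> None:
    """Store what a passenger said about a newly detected smell."""
    strategy = "none" if pleasant else "ventilate"  # e.g. open windows, run HVAC
    ODOUR_MEMORY[fingerprint] = {
        "description": description,
        "pleasant": pleasant,
        "strategy": strategy,
    }

def mitigation_for(fingerprint: tuple) -> str:
    """Next time the e-nose detects this fingerprint, reuse the learned strategy."""
    known = ODOUR_MEMORY.get(fingerprint)
    return known["strategy"] if known else "ask_passenger"

# A passenger labels a new smell once; the car remembers it afterwards.
record_feedback((0.8, 0.1, 0.3), "wet dog", pleasant=False)
```

After one round of feedback, the same fingerprint maps straight to a mitigation strategy instead of another question to the passenger.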
In a world of shared car ownership and self-driving taxi fleets, Ford wants to make sure the future doesn’t stink.
This patent application is short but interesting, and points towards Nvidia wanting to bridge the existing digital world into a VR-friendly world.
Virtual reality gives users a full 360-degree view of a scene. However, most of our cameras and images aren’t panoramic, and therefore can’t be fully enjoyed in VR yet.
In this filing, Nvidia wants to use generative adversarial networks (GANs) to take standard images and turn them into panoramic images.
At first glance, this might seem pretty mundane.
But in reality, this filing points towards turning every existing form of 2D media into a scene that people can enter in VR. At present, these generated scenes may just be static, panoramic images. But GANs could eventually be used to animate the characters and generate appropriate sound for the scene. AI could turn the people in an image into characters you can interact with. VR may not just be about generating future worlds, but about jumping back into historic worlds.
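The setup implied by the filing is essentially outpainting: put the original photo on a wider panoramic canvas and have a generator fill in the margins. This sketch only prepares the canvas and mask; the trained GAN generator itself is stubbed out, and all shapes and names here are illustrative assumptions:

```python
import numpy as np

def prepare_outpainting_input(image: np.ndarray, pano_width: int):
    """Centre a standard photo on a wider canvas; mark the margins to generate."""
    h, w, c = image.shape
    canvas = np.zeros((h, pano_width, c), dtype=image.dtype)
    mask = np.ones((h, pano_width), dtype=bool)  # True = pixels to synthesise
    left = (pano_width - w) // 2
    canvas[:, left:left + w] = image
    mask[:, left:left + w] = False  # the real photo stays untouched
    return canvas, mask

photo = np.random.randint(0, 256, size=(256, 256, 3), dtype=np.uint8)
canvas, mask = prepare_outpainting_input(photo, pano_width=1024)
# A GAN generator would now fill canvas[mask], conditioned on the known pixels.
```

The generator’s job is to hallucinate plausible surroundings that blend seamlessly with the known centre region - the same conditioning idea used in image inpainting, just applied outward.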
Before you go
Join 200,000 investors following Patent Drop on public.com and get weekly TLDRs of Patent Drop for publicly traded companies.
Have any feedback, ideas or collaborations for Patent Drop? Hit reply :)