"Here is how we make money from nVIDIA customers: The first nVIDIA platform which Nexoptic targeted is Jetson, so what is Jetson anyways? Jetson is the “AllSpark” of the Autobots. Its main use is in Embedded Stand Alone devices, and offers scalability from a tiny two-watt micro version which could be used to manicure your toenails or put your contacts on in the morning, to a full blown computer ideal for autonomous machines like delivery and logistics robots, factory systems, and large industrial UAVs (unmanned aerial vehicles ie. drones). Here is an excerpt from the nVIDIA 2020 Annual Report: NVIDIA AI for Inference Someday, trillions of connected devices will perceive their environment, infer a response, and do seemingly smart things. We will simply tell these devices what we need. Recent breakthroughs are making this future possible sooner rather than later. https://s22.q4cdn.com/364334381/files/doc_financials/2020/ar… That’s all very cool and sci-fi-like, but as a NexOptic investor how can I be sure that I’ll make money this year from Jetson sales? For these (countless) devices to accurately perceive, they need to see clearly. Their ability to infer a response and not mess up, requires accurate vision. Image Sensors roll off assembly lines with imperfections acceptably within tolerance. As these autonomous devices move around, the images/video feed is infected by the environment, which creates distractions to what the device is trying to figure out what is around it. Low light, noise, blur, glare, all give false indications of what is out there. This is where ALIIS comes in. As we all know, ALIIS learns the imperfections of the image sensor in a microsecond and proceeds to clean the image / video feed via its API with a few lines of code. ALIIS’ patent pending AI tech removes the glare, noise, blur, etc, in real time, so why would companies like MVTech, or any of the other thousands of companies which build on Jetson devices, turn down an opportunity which costs pennies per feature and leave their ability to do smart (or dumb) things based on messy unpredictable vision. If I wanted to build the next Roomba, or automatic lawn mower, and I wanted to minimize law suits, I would choose Jetson and I wouldn’t quibble over the price of adding ALIIS to my design. https://www.mvtec.com/products/embedded-vision/all-products/… Every company that uses Jetson together with an image sensor does their “thing” with what it sees, and they want to do that “thing” the best in the market so it can grow its customer base. Being an “nVIDIA Preferred Member”immediately brings ALIIS to the forefront when their engineers are looking to improve their product. If you were Sparks (image below), you would definitely need ALIIS. If you were an Amazon delivery drone, it would be good to know that thing coming towards you is the home owner you are authorized to deliver to, not the porch pirate. So, once Kevin Gordon has mastered Jetson, what is next? Well, Jetson is based on the Tegra processor, and runs on the nVIDIA AGX platform. The brothers of Jetson on the AGX platform are CLARA and DRIVE. According to the nVIDIA web, CLARA is used for medical instruments, and DRIVE is for Self Driving Vehicles. I don’t think it takes much engineering to modify the ALIIS API to work on the CLARA and DRIVE platforms of nVIDIA products. The prior NXO news release talked about Medical Devices specifically… hmmmm. 
ALIIS will prove to be the cash cow of NexOptic; it will fund and springboard other initiatives, and many of us will be retiring on our NexOptic dividend cheques. When we break through $3.76 I will not sell what I have; I will be buying more stock. A couple of weeks ago, when NXO was trading at 36 cents, I suggested it was the perfect time to buy based on chart technical analysis. Recently I laid out my opinion of the price per share based on P/E multiples of 50 and 70 for each million in earnings once revenue is declared. My best advice for anyone thinking about adding here is to do your own due diligence: research Jetson, Snapdragon, nVIDIA and Qualcomm. I did that, and I feel very excited to add here and now, while it is still below a buck.
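And for anyone who wants to sanity-check that P/E math for themselves, here is the back-of-the-envelope arithmetic. The share count below is a number I picked purely for illustration, not NXO's actual float, so plug in the real figure from your own due diligence:

    # Back-of-the-envelope: implied price per share for each $1M of annual earnings.
    # shares_outstanding is a PLACEHOLDER -- substitute NXO's real share count.
    shares_outstanding = 250_000_000   # hypothetical, for illustration only

    for pe in (50, 70):
        for earnings_millions in (1, 5, 10):
            eps = earnings_millions * 1_000_000 / shares_outstanding
            implied_price = pe * eps   # price per share = P/E x earnings per share
            print(f"P/E {pe}, ${earnings_millions}M earnings -> ~${implied_price:.2f}/share")

A smaller share count scales the implied price per share up proportionally, and a larger one scales it down, so the float you use matters as much as the P/E you assume.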