Wiliot builds its Physical AI supply chain platform on Databricks to operationalize item-level IoT data

By Marc Kavinsky, Lead Editor at IoT Business News.

Wiliot has partnered with Databricks to run its Physical AI platform and supply chain automation solutions on the Databricks platform, aiming to make large volumes of item-level sensor data easier for enterprises to ingest, govern, and use for operational decisions.

For all the talk about AI in operations, supply chains still run into a stubborn bottleneck: most physical-world signals are either sampled sporadically (via scans and audits) or trapped in siloed systems that are hard to join with enterprise data. Item-level visibility can generate enormous data volumes, but turning that firehose into governed, reusable inputs for analytics and automation is where many initiatives slow down.

Wiliot’s latest move is aimed squarely at that “data-to-decision” gap. The company says it will run its Physical AI platform and its supply chain automation solutions on the Databricks platform, formalizing the relationship as a Databricks Built-On partner. In practical terms, Wiliot is positioning Databricks’ lakehouse architecture as the underlying data layer for ingesting and analyzing real-time streams generated by Wiliot’s battery-free IoT Pixels.

What changes when Physical AI sits on a lakehouse

Wiliot’s proposition is built around item-level sensing: postage-stamp-sized, battery-free Bluetooth sensors (“IoT Pixels”) that turn products and assets into data sources, feeding what the company calls a continuous stream of granular signals across supply chain and retail environments. The challenge for enterprise teams is less about collecting a pilot dataset and more about sustaining pipelines that can accommodate high-frequency event data while still meeting governance and security expectations.

By placing its platform on Databricks, Wiliot is effectively aligning itself with a data engineering environment already used for large-scale analytics and AI workloads. Wiliot says the combined setup uses Databricks compute, data handling, and storage to ingest real-time streams from Wiliot Physical AI networks, and then apply Wiliot’s analytics logic for outcomes such as predicting disruptions, automating inventory management, and optimizing cold chain logistics.
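Wiliot has not published the internals of that pipeline, but the general shape of a stream-to-decision flow can be illustrated with a small, hypothetical sketch: windowing item-level temperature readings and flagging cold chain excursions. Every name and threshold below is an illustrative assumption, not Wiliot's or Databricks' API; in a real Databricks deployment this logic would more typically be expressed as a Spark Structured Streaming job writing to governed tables.

```python
from dataclasses import dataclass
from collections import defaultdict, deque

@dataclass
class PixelEvent:
    # Hypothetical item-level reading from a battery-free tag
    tag_id: str
    temp_c: float
    ts: int  # epoch seconds

class ColdChainMonitor:
    """Flags a tag once it reports a full window of out-of-range readings."""
    def __init__(self, max_temp_c: float = 8.0, window: int = 3):
        self.max_temp_c = max_temp_c
        self.window = window
        # Keep only the most recent `window` readings per tag
        self.history = defaultdict(lambda: deque(maxlen=window))

    def ingest(self, event: PixelEvent) -> bool:
        """Returns True when the event completes a run of consecutive
        over-threshold readings, i.e. a cold chain excursion."""
        h = self.history[event.tag_id]
        h.append(event.temp_c)
        return len(h) == self.window and all(t > self.max_temp_c for t in h)

monitor = ColdChainMonitor()
readings = [4.1, 9.2, 9.5, 9.8]  # degrees C for one tagged item
alerts = [monitor.ingest(PixelEvent("tag-1", t, i)) for i, t in enumerate(readings)]
# Only the final reading completes three consecutive out-of-range samples
```

The point of the sketch is the division of labor the announcement implies: the platform handles continuous ingestion at scale, while domain logic like this excursion rule runs on top of the governed stream.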

The distinct angle here is that this is not a generic “AI partnership” announcement centered on model building. Wiliot is making the data architecture decision the headline: Physical AI is being framed as an operational data product—physical-world events unified with enterprise data sources—rather than a standalone IoT cloud that exports dashboards. That matters because, for many enterprises, the hard part is not creating another operational screen; it is making sensor data usable across multiple teams and systems without re-integration every time a new use case appears.

Where Wiliot expects customers to feel the impact

Wiliot ties the Databricks foundation to five solutions it already sells: inventory intelligence, automated receiving, automated shipment verification, reusable asset tracking, and temperature monitoring. These use cases share a common requirement: they depend on reliable, near-real-time event processing at different physical “handoff points” (receiving docks, storage zones, yard movements, outbound doors, in-transit condition changes).

A key implication—one not stated explicitly but evident from the architecture—is that Wiliot is optimizing for enterprises that want to treat physical-event data as a first-class dataset within broader analytics programs. If Physical AI events land in the same governed environment as other operational and business datasets, it becomes easier to operationalize them beyond the immediate supply chain team—without building point-to-point exports for every stakeholder. In its announcement, Wiliot specifically calls out making insights accessible across business units such as operations, logistics, merchandising, and sustainability teams.

Databricks, for its part, emphasizes unifying physical-world data with enterprise data, and highlights retail-focused outcomes like reducing out-of-stocks and shrink and improving store experiences through converged signals such as location and temperature.

Broader industry relevance: IoT data is moving closer to enterprise AI stacks

This partnership is part of a larger shift in IoT: as enterprises consolidate analytics and AI workflows, IoT platforms increasingly need to “fit into” enterprise data stacks rather than sit alongside them. Item-level sensing pushes that requirement even further because data volumes and event velocity can be high, and because the value often depends on joining sensor events with reference data (products, locations, shipments) and operational context (orders, exceptions, compliance processes).
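That joining step is where much of the value sits, and it can be sketched in a few lines. The record shapes below (a product master keyed by tag, a location registry keyed by gateway) are assumptions for illustration only; in practice these would be governed reference tables joined in SQL or Spark rather than Python dictionaries.

```python
# Hypothetical enrichment: join a raw tag event with reference data
# (product master, location registry) to produce a decision-ready record.
product_master = {"tag-1": {"sku": "SKU-001", "desc": "Vaccine carton"}}
location_registry = {"gw-dock-7": {"site": "DC-East", "zone": "receiving"}}

def enrich(event: dict) -> dict:
    """Attach product and location context to a raw sensor event."""
    product = product_master.get(event["tag_id"], {})
    location = location_registry.get(event["gateway_id"], {})
    return {**event, **product, **location}

raw = {"tag_id": "tag-1", "gateway_id": "gw-dock-7", "temp_c": 5.2}
record = enrich(raw)
# record now carries sku, site, and zone alongside the raw reading
```

A bare temperature ping means little; the same ping joined to "vaccine carton, DC-East receiving zone" is something a workflow can act on, which is the argument for landing sensor events next to reference data in the first place.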

Wiliot’s approach also reflects a reality in supply chain automation: outcomes like scan-free receiving or automated shipment verification are not only about device connectivity—they depend on data reliability, identity resolution, and governance so that events can be trusted for workflow actions. Positioning Databricks as the underlying platform is a bet that customers want those foundations in a familiar, enterprise-grade environment.
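One concrete piece of that reliability requirement is deduplication: a tag arriving at a dock may broadcast many times, but scan-free receiving should fire once. Wiliot has not described its mechanism; as one illustrative assumption of what "events trusted for workflow actions" entails, a cooldown-based deduplicator might look like this:

```python
class EventDeduplicator:
    """Suppresses repeated broadcasts of the same tag within a cooldown,
    so one physical arrival triggers one workflow action, not many."""
    def __init__(self, cooldown_s: int = 60):
        self.cooldown_s = cooldown_s
        self.last_seen = {}  # tag_id -> timestamp of most recent broadcast

    def should_act(self, tag_id: str, ts: int) -> bool:
        prev = self.last_seen.get(tag_id)
        # Sliding cooldown: every broadcast, acted on or not, resets the clock
        self.last_seen[tag_id] = ts
        return prev is None or ts - prev >= self.cooldown_s

dedup = EventDeduplicator(cooldown_s=60)
# A tag chirping at t=0, 5, 30, 90 seconds yields two actions, not four
decisions = [dedup.should_act("tag-1", t) for t in (0, 5, 30, 90)]
```

Rules like this are mundane, but they are exactly the foundations the article argues customers want handled in a familiar, enterprise-grade environment rather than rebuilt per use case.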

What OEMs, integrators, and enterprises should take away

For enterprises already invested in Databricks, Wiliot’s Built-On positioning may reduce friction in operationalizing Wiliot-generated data alongside existing data engineering, governance, and AI practices. That can shorten the path from “we have sensor signals” to “we can use them in multiple applications,” particularly where different teams need access under a unified governance model.

For system integrators, the announcement suggests a clearer landing zone for Wiliot event streams: a lakehouse-centric pipeline rather than a closed analytics endpoint. That can make it easier to design cross-domain solutions where item-level visibility informs not just supply chain workflows but adjacent analytics initiatives.

And for connectivity and IoT ecosystem players, the message is that Physical AI is increasingly being sold as an enterprise data capability—where the differentiator is not simply the sensor, but the ability to continuously translate physical events into governed, reusable data products that plug into modern AI and analytics stacks.
