An AI music platform pivoting to data centers is planning to deploy solar-powered GPUs in the parking lots of sites owned by a medical real estate firm.
Auddia Inc. this week announced that LT350 has signed a non-binding Letter of Intent (LOI) with an unnamed NYSE-listed medical REIT to host LT350’s first pilot installation, expected to be at a hospital property in the Dallas-Fort Worth area.
The LOI outlines plans to collaborate on deploying LT350’s first solar-integrated, parking-lot-based AI micro-data center canopy. LT350’s designs integrate modular GPU, memory, and battery storage cartridges directly into the ceiling of its proprietary solar canopy.
Auddia, which is set to merge with LT350, said the canopy enables high-performance AI compute to be deployed above existing parking lots without absorbing parking spaces or requiring new land acquisition.
“Healthcare is one of the most latency-sensitive and data security-intensive environments for AI inference,” said Jeff Thramann, CEO of Auddia and founder of LT350. “We believe this LOI represents a meaningful validation of LT350’s potential to deliver secure, high-performance, on-premise inference compute directly adjacent to clinical operations.”
“We view this pilot as the first step in a broader strategy to bring distributed AI infrastructure to healthcare campuses nationwide,” Thramann added. “Hospitals and medical facilities are among the highest-value inference environments, and we believe LT350 is uniquely positioned to serve them.”
Nasdaq-listed Auddia offers AI-powered music apps, including one offering radio and podcast streaming services. It is merging with Thramann’s holding company, which includes ownership of LT350. Upon closing of the transaction, Auddia and the merged companies will be renamed McCarthy Finney, with Auddia, LT350, and two health-related units sitting under the new parent company.
LT350 is described in SEC filings as a platform infrastructure company leveraging a proprietary solar parking lot canopy that integrates modular plug & play cartridges into the ceiling of the canopies.
Its cloud infrastructure cartridges house the servers and GPUs; battery storage cartridges house batteries and provide grid services to local utilities; smart inverter cartridges route solar energy to the GPUs and batteries in the canopies, or to the grid; and EV charging cartridges house the components to charge EVs. Its website suggests each GPU unit will feature eight AI accelerators and offer up to 10kW of liquid-cooled capacity. Up to three GPU units will fit within each GPU cartridge, suggesting 24 accelerators and 30kW of capacity per cartridge.
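For readers wanting to sanity-check those figures, a back-of-the-envelope sketch of the per-cartridge math implied by the website's numbers (the variable names are ours, not LT350's):

```python
# Per-cartridge capacity implied by LT350's published GPU unit specs.
# Figures are as reported; names here are illustrative only.
ACCELERATORS_PER_UNIT = 8   # AI accelerators per GPU unit
KW_PER_UNIT = 10            # liquid-cooled capacity per unit, in kW
UNITS_PER_CARTRIDGE = 3     # maximum GPU units per cartridge

accelerators = ACCELERATORS_PER_UNIT * UNITS_PER_CARTRIDGE
capacity_kw = KW_PER_UNIT * UNITS_PER_CARTRIDGE

print(f"{accelerators} accelerators, {capacity_kw} kW per cartridge")
# → 24 accelerators, 30 kW per cartridge
```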
“Hyperscalers built the training layer,” said Thramann in a previous announcement. “LT350 is building the distributed inference layer — one that we believe will be faster to deploy, cheaper to operate, and dramatically more energy efficient, while generating premium revenue for premium inference compute services.”
The unnamed medical REIT owns and manages approximately 200 medical facilities across the United States, including hospitals, ambulatory surgery centers, and medical office buildings. If the pilot is successful, LT350 expects to expand across the medical REIT’s broader portfolio.
Solar panels over parking lots – known as canopy solar – have been deployed at a number of data center locations. Digital Realty has done this in South Africa and Switzerland.
Last year, Belgian startup Tonomia announced it was working with UK hardware provider Panchaea to offer a distributed AI platform housed in solar canopy systems in parking lots, known as eCloud.
The company aims to integrate servers into its existing eParking Solar Modules alongside its lithium-ion or sodium-ion battery systems. Tonomia says its canopies can generate 600W each in Europe and 750W in the US (due to the larger parking spaces).
Read the original article: https://www.datacenterdynamics.com/en/news/parking-lot-solar-gpu-canopy-deployment-planned-in-texas/