Enabling AI-driven health advances without sacrificing patient privacy

There's a lot of excitement at the intersection of artificial intelligence and health care. AI has already been used to improve disease treatment and detection, discover promising new drugs, identify links between genes and diseases, and more.

By analyzing big datasets and finding patterns, virtually any new algorithm has the potential to help patients — AI researchers just need access to the right data to train and test those algorithms. Hospitals, understandably, are hesitant to share sensitive patient information with research teams. When they do share data, it's hard to verify that researchers are only using the data they need and deleting it after they're done.

Secure AI Labs (SAIL) is addressing those problems with a technology that lets AI algorithms run on encrypted datasets that never leave the data owner's system. Health care organizations can control how their datasets are used, while researchers can protect the confidentiality of their models and search queries. Neither party needs to see the data or the model to collaborate.

SAIL's platform can also combine data from multiple sources, creating rich insights that fuel more powerful algorithms.

“You shouldn't have to schmooze with hospital executives for five years before you can run your machine learning algorithm,” says SAIL co-founder and MIT Professor Manolis Kellis, who co-founded the company with CEO Anne Kim ’16, SM ’17. “Our goal is to help patients, to help machine learning scientists, and to create new therapeutics. We want new algorithms — the best algorithms — to be applied to the biggest possible data set.”

SAIL has already partnered with hospitals and life science companies to unlock anonymized data for researchers. In the next year, the company hopes to be working with about half of the top 50 academic medical centers in the country.

Unleashing AI's full potential

As an undergraduate at MIT studying computer science and molecular biology, Kim worked with researchers in the Computer Science and Artificial Intelligence Laboratory (CSAIL) to analyze data from clinical trials, gene association studies, hospital intensive care units, and more.

“I realized there is something severely broken in data sharing, whether it was hospitals using hard drives, ancient file transfer protocol, or even sending stuff in the mail,” Kim says. “It was all just not well-tracked.”

Kellis, who is also a member of the Broad Institute of MIT and Harvard, has spent years establishing partnerships with hospitals and consortia across a range of diseases, including cancers, heart disease, schizophrenia, and obesity. He knew that smaller research teams would struggle to get access to the same data his lab was working with.

In 2017, Kellis and Kim decided to commercialize technology they were developing to allow AI algorithms to run on encrypted data.

In the summer of 2018, Kim participated in the delta v startup accelerator run by the Martin Trust Center for MIT Entrepreneurship. The founders also received support from the Sandbox Innovation Fund and the Venture Mentoring Service, and made several early connections through their MIT network.

To participate in SAIL's program, hospitals and other health care organizations make parts of their data available to researchers by setting up a node behind their firewall. SAIL then sends encrypted algorithms to the servers where the datasets reside, in a process called federated learning. The algorithms crunch the data locally in each server and transmit the results back to a central model, which updates itself. No one — not the researchers, the data owners, or even SAIL — has access to the models or the datasets.
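The training loop described above can be sketched in miniature. This is a generic illustration of federated averaging, not SAIL's actual platform (which additionally keeps data and models encrypted); all function names and the toy "hospital" datasets here are hypothetical:

```python
# Minimal federated-learning sketch: each site trains on its own
# private data, and only model weights travel to the central server.
# Illustrative only -- not SAIL's API; SAIL also encrypts the exchange.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=50):
    """Gradient descent for linear regression on one site's local data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient
        w -= lr * grad
    return w

def federated_round(global_w, sites):
    """One round: every site trains locally; the central server averages
    the returned weights. Raw patient data never leaves the sites."""
    updates = [local_update(global_w, X, y) for X, y in sites]
    return np.mean(updates, axis=0)

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

# Three simulated "hospitals", each holding its own private dataset
sites = []
for _ in range(3):
    X = rng.normal(size=(100, 2))
    y = X @ true_w + rng.normal(scale=0.01, size=100)
    sites.append((X, y))

w = np.zeros(2)
for _ in range(20):
    w = federated_round(w, sites)
# w is now close to the true weights [2.0, -1.0], even though no
# single party ever saw more than its own share of the data.
```

The key property is visible in `federated_round`: the server only ever handles weight vectors, so pooling data from many institutions never requires centralizing the records themselves.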

The approach allows a much broader set of researchers to apply their models to big datasets. To further engage the research community, Kellis' lab at MIT has begun holding competitions in which it gives access to datasets in areas like protein function and gene expression, and challenges researchers to predict outcomes.

“We ask machine learning researchers to come and train on last year's data and predict this year's data,” says Kellis. “If we see there's a new type of algorithm that is performing best in these community-level assessments, people can adopt it locally at many different institutions and level the playing field. So the only thing that matters is the quality of your algorithm, rather than the power of your connections.”

By enabling a large number of datasets to be anonymized into aggregate insights, SAIL's technology also lets researchers study rare diseases, in which small pools of relevant patient data are often spread out among many institutions. That has historically made the data hard to apply AI models to.

“We're hoping that all of these datasets will eventually be open,” Kellis says. “We can cut across all the silos and enable a new era where every patient with every rare disorder across the whole world can come together in a single keystroke to analyze data.”

Enabling the medicine of the future

To work with large amounts of data about specific diseases, SAIL has increasingly sought to partner with patient associations and consortia of health care groups, including an international health care consulting company and the Kidney Cancer Association. The partnerships also align SAIL with patients, the group they're most trying to help.

Overall, the founders are happy to see SAIL solving, for researchers around the world, problems they once faced in their own labs.

“The right place to solve this is not an academic project. The right place to solve this is in industry, where we can provide a platform not just for my lab but for any researcher,” Kellis says. “It's about creating an ecosystem of academia, researchers, pharma, biotech, and hospital partners. I think it's the blending of all of these different areas that will make that vision of the medicine of the future become a reality.”