EU's GDPR Could Cost Firms up to $20m (4% of Global Revenue)

February 5, 2016

Firms are spending tens to hundreds of millions of dollars building new data centers in the EU to comply with post-Safe Harbor regulations and avoid hefty fines of up to $20m or 4% of global revenue. Despite their best efforts, employees' unsanctioned use of cloud applications that contain personal data could still render companies liable.

Attack of the Ombudsperson

March 2, 2016

The draft document for ‘Safe Harbor 2.0’ was released on March 2, and is pending review and approval by the EU Article 29 Working Party by the end of March (sure). Sidley Austin’s Data Matters blog covers it well. In summary, the new framework is ‘significantly different’ from Safe Harbor 1.0 so companies must re-certify to “ensure a level of protection of personal data..."

Safe Harbor 2.0 and the reaction of cyber imperialists

March 7, 2016

Mr. Schrems has his doubts about 'Safe Harbor 2.0', according to his recent interview with Ars Technica. Others have been quick to jump on board with dissent, eyeing opportunities to become a neutral data haven. As John Whelan, a data privacy lawyer, told an Irish news site: "If Privacy Shield doesn't work out and ultimately data has to be segregated..."

Rethinking IoT Security

March 17, 2016

With over 20 billion devices coming online by 2020 and an estimated 25 vulnerabilities per product, it's no wonder that IoT security is a hot topic. While acknowledging that encryption is not the complete answer, we maintain that data should be protected as it is created.

[Secure] Sharing is Caring

June 5, 2016

Keyword search is enabled on shared data by utilizing a key-exchange system based on standard public- and secret-key cryptography. The _ultra encrypted key architecture allows applications to manage information in vulnerable cloud or on-premises environments while keeping sensitive data unreadable to the infrastructure provider and host.
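The _ultra key architecture itself is proprietary, but the general idea of keyword search over shared encrypted data can be sketched with a minimal symmetric searchable-encryption index. Everything below (the `keyword_token` trapdoor scheme, the `EncryptedIndex` class) is an illustrative assumption, not Inpher's actual design: the client derives deterministic HMAC tokens per keyword, the server matches opaque tokens without learning the words, and the search key could be shared with a collaborator over a public-key channel.

```python
import hashlib
import hmac
import os

def keyword_token(key: bytes, word: str) -> bytes:
    # Deterministic trapdoor: the server can match tokens without learning the word.
    return hmac.new(key, word.lower().encode(), hashlib.sha256).digest()

class EncryptedIndex:
    """Server-side index: maps opaque keyword tokens to document ids."""
    def __init__(self):
        self.index = {}

    def add(self, doc_id: str, tokens):
        for t in tokens:
            self.index.setdefault(t, set()).add(doc_id)

    def search(self, trapdoor: bytes):
        # The server never sees plaintext keywords, only HMAC tokens.
        return self.index.get(trapdoor, set())

# Client side: index a document under a secret search key.
search_key = os.urandom(32)
server = EncryptedIndex()
server.add("doc-1", [keyword_token(search_key, w) for w in ["invoice", "q3"]])

# Sharing: sending search_key to a collaborator (e.g., encrypted under their
# public key) lets them derive trapdoors and query the same index.
print(server.search(keyword_token(search_key, "invoice")))  # {'doc-1'}
```

The host only ever stores and compares opaque 32-byte tokens, so a compromised server learns access patterns at worst, never the keywords or data.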

Gone, Not Forgotten.

June 16, 2016

Strong privacy laws that establish the 'right to be forgotten' may be unenforceable. EU citizens can request that search engines remove results that are no longer relevant or accurate; however, researchers at NYU have found that even after links are delisted it is possible to determine the names of individuals who petitioned for their removal.

Open Camps Conference at UN

July 12, 2016

The world's largest mission-driven open source conference, Open Camps aims to "break down barriers to technology innovation through open source governance, communities and collaboration." The Inpher team presented the _ultra development platform for application-level security and privacy at the Search Camp session in New York on July 10th.

Cloud Security by the Numbers

August 22, 2016

The recent Ponemon study sponsored by Gemalto surveyed over 3,000 IT professionals on the "Global State of Cloud Data Security." A webcast and the full report are available. The participants represented a good cross section of company sizes and geographic locations around the world.

Confusion in China's Cyber Laws

November 8, 2016

The latest in a wave of sovereign data security laws has emerged from China, causing some alarm among companies trying to understand how it could impact their businesses. Several sectors are identified as "critical information infrastructure," including telecommunications, information services and finance; operators in these sectors would be required to store personal information and sensitive business data in China.

Behavioral Futures and Surveillance Capitalism

December 30, 2016

The inevitable onslaught of targeted advertisements has both consumers and technology companies wondering whether there is any alternative future for internet economics. Jonathan Shaw recently published a compelling piece in Harvard Magazine, breaking down some of the biggest challenges to our understanding of individual freedoms and technological progress.

Banking on [Digital] Trust

February 8, 2017

Trust cements the foundation of the banking industry. Without it, we would be more apt to keep cash stuffed under our mattresses than in the impenetrable vault of a stranger. Modern digital banking wins and maintains customers' trust based on the security, transparency and accessibility of their data. Unfortunately, those three qualities do not always coexist.

US Senator Encourages Use of 'Privacy Enhancing Technologies'

May 23, 2017

In a letter to the Commission on Evidence-Based Policymaking, US Senator Ron Wyden (D-Ore.) proposed the use of privacy-enhancing technologies (PETs) by government agencies in order to protect sensitive data.

The Spectre of Hardware Security Looming over Intel SGX

January 16, 2018

As the fallout of the Spectre and Meltdown vulnerabilities settles, the future of hardware-level security becomes fuzzier. There are many comprehensive reports on the attack vectors, patches and respective performance degradation, perhaps most lucidly presented by Peter Bright at Ars Technica.

Securing Autonomous Fleets with Global-Trained Localized Brains

March 18, 2018

It is possible to train machine learning models with private data sets so that no single data point is identified but statistical learning is maintained, including outliers like balls bouncing in the road or black ice conditions. Just like with humans, the more you experience, the more you know how to react in the future.

Predictions with Privacy for Patient Data

July 16, 2019

It has been proven that de-identified data points on anything from credit card transactions to healthcare records can be re-identified, often quickly, by trained data scientists with access to additional data points. A study conducted in 2000, for example, found that 87 percent of the U.S. population could be uniquely identified using a combination of their gender, birth date and ZIP code.
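The power of such quasi-identifiers is easy to demonstrate. The toy records below are hypothetical, and the `k_anonymity` helper simply measures the smallest group of people sharing the same (gender, birth date, ZIP) tuple; whenever that minimum is 1, someone in the "de-identified" data is unique and therefore re-identifiable by anyone holding an auxiliary dataset with names attached.

```python
from collections import Counter

# Toy "de-identified" records: names removed, but quasi-identifiers remain.
records = [
    ("F", "1981-01-02", "10001"),
    ("M", "1975-06-30", "10001"),
    ("F", "1981-01-02", "94105"),
    ("M", "1990-12-12", "60601"),
    ("M", "1975-06-30", "10001"),
]

def k_anonymity(rows):
    """Smallest equivalence-class size over the quasi-identifier tuple.
    k == 1 means at least one person is uniquely identifiable."""
    counts = Counter(rows)
    return min(counts.values())

# How many people are unique on (gender, birth date, ZIP)?
unique = sum(1 for c in Counter(records).values() if c == 1)
print(k_anonymity(records), unique)  # 1 3
```

Three of the five toy records are unique on just those three attributes, mirroring the Sweeney-style result the excerpt cites at population scale.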

An Alternative to Big Tech Breakup: Secure, Private Data-Sharing

February 7, 2020

Is there a good alternative to breaking up Big Tech? Yes, and the solution is secure, private data-sharing. With Secret Computing®, Big Tech companies could securely share their internal data with smaller competitors without ever sending or revealing their sensitive data in the process. This would effectively reduce Big Tech market power while maintaining the highest privacy standards.

A Call to Action: The Regulator’s Role in Supporting Privacy-Enhancing Technologies for Data-Driven Financial Crime Investigations

July 28, 2020

Combating financial crimes and protecting privacy are equally critical goals in modern society.

Financial crimes, when left undetected, can legitimize illegal profiting from systems of injustice on a global scale. Disregard for privacy can undermine an individual’s freedom to control how their personal information is used and disseminated. Risky data practices can further expose people to irreversible harms such as social profiling, identity theft, and data breaches.

Better ESG Benchmarking with Secret Computing®

October 20, 2020

As asset flows continue to pour into environmental, social and governance (ESG) factored investments, the problem of establishing reliable benchmarks persists. Investors and corporate managers now have the ability to incorporate PETs and Secret Computing® to generate better benchmarking that aligns sustained value with financial performance.

The Privacy Risk Right Under Our Nose in Federated Learning

February 23, 2021

Many consumers use phones and IoT devices that rely on federated learning (FL) every day without knowing what it is. That novelty shields FL from scrutiny even as its inherent, and new, security vulnerabilities come to light. While FL is a useful technique for aggregating information held by millions of compute nodes, it needs an extra layer of privacy-enhancing technology to ensure there is no leakage of personal information through model parameters, which can expose sensitive inferences at the individual level.
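A minimal sketch of why raw FL updates can leak: for a linear model trained on a single example, the gradient a client ships to the server is a scaled copy of the private input, and the bias gradient reveals the scale. Real attacks on deep networks are more involved; this toy linear case is only an illustration of the principle.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=4)   # global model weights held by the server
b = 0.0                  # global bias
x = rng.normal(size=4)   # one client's private training example
y = 1.0                  # its private label

# Client computes a plain SGD update on its single example (squared loss)
# and sends the raw gradients to the aggregation server.
err = (w @ x + b) - y
grad_w = 2 * err * x     # gradient w.r.t. weights: a scaled copy of x
grad_b = 2 * err         # gradient w.r.t. bias: the scale factor itself

# Server-side "attack": dividing the two reconstructs the private input exactly.
x_reconstructed = grad_w / grad_b
assert np.allclose(x_reconstructed, x)
```

This is exactly the kind of parameter-level leakage that an added layer of PETs (secure aggregation, MPC, or differential privacy) is meant to block.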

EU and U.S. Policymakers Emphasize Privacy-Enhancing Technologies as a Shared Priority in 2021

February 26, 2021


Building Privacy-Preserving Decision Tree Models Using Multi-Party Computation

December 7, 2021

Machine learning (ML) is increasingly important in a wide range of applications, including market forecasting, service personalization, voice and facial recognition, autonomous driving, health diagnostics, education, and security analytics. Because ML touches so many aspects of our lives, it’s of vital concern that ML systems protect the privacy of the data used to train them, the confidential queries submitted to them, and the confidential predictions they return. Privacy protection — and the protection of organizations’ intellectual property — motivates the study of privacy-preserving machine learning (PPML). In essence, the goal of PPML is to perform machine learning in a manner that does not reveal any unnecessary information about training data sets, queries, and predictions. This article shows how to address privacy challenges and use PPML in XGBoost training and prediction.
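The MPC protocols behind PPML training are considerably more elaborate, but the primitive they build on can be shown in a few lines: additive secret sharing, where each private value is split into random shares, parties compute on shares locally, and only the aggregate is ever reconstructed. The `share`/`reconstruct` helpers below are an illustrative sketch, not the XOR Platform implementation.

```python
import random

PRIME = 2_147_483_647  # field modulus (2**31 - 1); all arithmetic is mod PRIME

def share(secret: int, n_parties: int):
    """Split `secret` into n additive shares; any n-1 shares look uniformly
    random and reveal nothing about the secret."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    return sum(shares) % PRIME

# Two data owners each secret-share a private count among three compute
# parties; each party adds its two shares locally, so only the aggregate
# sum is ever reconstructed, never either input.
a_shares = share(120, 3)
b_shares = share(45, 3)
sum_shares = [(sa + sb) % PRIME for sa, sb in zip(a_shares, b_shares)]
print(reconstruct(sum_shares))  # 165
```

Addition of shares is enough for secure aggregation; multiplications (needed for tree splits and gradients in models like XGBoost) require extra protocol rounds, which is where real MPC systems earn their complexity.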

Neuroimaging Based Diagnosis of Alzheimer's Disease Using Privacy-Preserving Machine Learning (Part 1)

December 21, 2021

In this first of a three-part blog series, we present a viable solution designed by clinical researchers at CHUV and Inpher to build privacy-preserving, neuroimaging-based ML models of AD using Inpher’s XOR Platform. We specifically show how data providers can securely collaborate to build linear and logistic regression models that clinical and non-clinical researchers can use. With the rising hopes for efficient disease-modifying drugs, our clinically relevant concept of accurate ML-based diagnosis will help clinicians to efficiently stratify patients for clinical trials and ultimately deliver better care.

Neuroimaging Based Diagnosis of Alzheimer's Disease Using Privacy-Preserving Machine Learning (Part 2)

February 21, 2022

This second post of the three-part blog series describes two privacy-preserving machine learning models (linear and logistic regression) for detecting Alzheimer's disease. The models help researchers understand AD's underlying mechanisms and help clinicians detect AD early on. We detail the workflow for training with data coming from multiple private sources (hospitals, radiology labs or research institutes), using a combination of SPM tools for local preprocessing and Inpher's XOR Platform for privacy-preserving machine learning.

Flying Fuzzy: A Privacy Preserved No-Fly List for Global Airlines Using Fuzzy String Matching

March 10, 2022

Since 2020, airlines have been dealing with a rash of irate, disruptive, and violent passengers, resulting in many of those travelers being banned from future flights. Yet each airline maintains its own no-fly list in a format unique to that carrier. This blog post shows how, with Inpher’s XOR Platform, the FAA and airlines can securely collaborate to identify no-fly passengers while protecting travelers’ privacy and civil liberties.
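Fuzzy string matching itself is standard; in the post it runs under MPC so no party ever sees the plaintext names, but in the clear the matching logic looks like the sketch below. The edit-distance threshold and the sample names are assumptions for illustration, not values from the post.

```python
def levenshtein(a: str, b: str) -> int:
    """Edit distance between two strings via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def fuzzy_match(name: str, no_fly, max_dist: int = 2) -> bool:
    """Flag a passenger if any listed name is within `max_dist` edits,
    catching misspellings and transliteration variants."""
    name = name.lower()
    return any(levenshtein(name, n.lower()) <= max_dist for n in no_fly)

no_fly = ["John Smith", "Maria Delgado"]
print(fuzzy_match("Jon Smyth", no_fly))    # True: within 2 edits of "John Smith"
print(fuzzy_match("Alice Wong", no_fly))   # False
```

In the privacy-preserving version, each pairwise comparison of this kind is evaluated on secret-shared name encodings, so airlines learn only the match/no-match bit, not each other's lists.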

How DataCo is Reinventing Data Partnerships by Putting Privacy-Enhancing Technologies to Work

June 5, 2022

Think of all the great customer experiences you have ever had: those unexpected business class upgrades, or Netflix “on the house” as part of your cell phone plan. What’s behind them all? Seamless customer intelligence. One company helping organizations accomplish this is DataCo Technologies, a startup that emerged out of ANZ Bank’s VC and innovation arm, 1835i. The DataCo Collaboration Platform, built with Inpher’s Secret Computing® technology, allows organizations to connect and collaborate with data partners, leveraging the latest privacy-enhancing and data protection technologies to remove the need for data to ever be shared directly.

XOR Hybrid Private Computing in Microsoft Azure

June 21, 2023

With expanding privacy regulations, data transfer restrictions and sophisticated cyber attacks on the rise, organizations face unprecedented challenges to store, process and share sensitive data.

A Winning Go-to-Market Starts with Unification: From CMO to CRO

August 7, 2023

In today's hyper-competitive business landscape, the success of a company relies heavily on the seamless integration and alignment of various functions; in my case, that meant much-needed alignment between sales and marketing.

Empowering Women Leaders: The Impact of Women in High-Tech

August 15, 2023

As a junior professional, venturing into the high-tech space has been a truly unique and eye-opening journey filled with brilliant encounters.

Privacy Budget and the Data-Driven Enterprise

September 18, 2023

Recent developments in machine learning and artificial intelligence have highlighted the benefits data exploration can bring to organizations across industries.
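In differential privacy terms, a "privacy budget" is the total epsilon an organization is willing to spend across noisy queries, with each query consuming part of it. A minimal sketch of the idea (the `PrivacyBudget` class is illustrative, not a product feature): the Laplace mechanism adds noise scaled to 1/epsilon for a count query, and the tracker refuses further queries once the budget is spent.

```python
import numpy as np

class PrivacyBudget:
    """Tracks cumulative epsilon spent across noisy queries."""
    def __init__(self, total_epsilon: float):
        self.remaining = total_epsilon

    def noisy_count(self, true_count: int, epsilon: float) -> float:
        if epsilon > self.remaining:
            raise RuntimeError("privacy budget exhausted")
        self.remaining -= epsilon
        # Laplace mechanism: a count has sensitivity 1, so noise scale = 1/epsilon.
        return true_count + np.random.laplace(scale=1.0 / epsilon)

budget = PrivacyBudget(total_epsilon=1.0)
print(budget.noisy_count(1000, epsilon=0.5))  # roughly 1000, plus Laplace noise
print(budget.noisy_count(1000, epsilon=0.5))  # spends the remaining budget
# A third query would raise RuntimeError: the budget is exhausted.
```

The trade-off is explicit: smaller per-query epsilon means more noise but leaves budget for more queries, which is exactly the accounting a data-driven enterprise has to plan around.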

Privacy Budget: A Roadmap to Privacy Preserving Data Collaboration

September 21, 2023

In the world of data collaboration, it is generally understood that when two or more parties share training data for their AI models, they achieve more accurate predictions while mitigating bias because they are learning from deeper and more diverse data points.

The Seven Foundational Principles of Privacy by Design

September 26, 2023

First introduced in 1995, privacy by design evolved alongside privacy-enhancing technologies (PETs) and is intended to address the broader systems and processes in which PETs operate.

Privacy Budget in Support of Privacy by Design

September 28, 2023

In this blog, we explore privacy budget in support of privacy by design for meeting global privacy regulations.