journal

EU's GDPR Could Cost Firms up to $20m (4% of Global Revenue)

February 5, 2016

Firms are spending tens to hundreds of millions of dollars building new data centers in the EU to comply with post-Safe Harbor regulations and avoid hefty fines of up to $20m or 4% of global revenue. Despite their best efforts, employees' unsanctioned use of cloud applications that contain personal data could still render companies liable.

Attack of the Ombudsperson

March 2, 2016

The draft document for ‘Safe Harbor 2.0’ was released on March 2 and is pending review and approval by the EU Article 29 Working Party by the end of March (sure). Sidley Austin’s Data Matters blog covers it well. In summary, the new framework is ‘significantly different’ from Safe Harbor 1.0, so companies must re-certify to “ensure a level of protection of personal data...”

Safe Harbor 2.0 and the reaction of cyber imperialists

March 7, 2016

Mr. Schrems has his doubts about 'Safe Harbor 2.0', according to his recent interview with Ars Technica. Others have been quick to jump on board with dissent, eyeing opportunities to become a neutral data haven. According to John Whelan, a data privacy lawyer, in an interview with the Irish site independent.ie, “If Privacy Shield doesn't work out and ultimately data has to be segregated."

Rethinking IoT Security

March 17, 2016

With over 20 billion devices coming online by 2020 and an estimated 25 vulnerabilities per product, it's no wonder that IoT security is a hot topic. While acknowledging that encryption is not the complete answer, we maintain that data should be protected as it is created.

[Secure] Sharing is Caring

June 5, 2016

Keyword search is enabled on shared data via a key-exchange system based on standard public- and secret-key cryptography. The _ultra encrypted key architecture allows applications to manage information in vulnerable cloud or on-premises environments while keeping sensitive data unreadable to the infrastructure provider and host.
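
The tag-matching idea behind keyword search over protected data can be illustrated with a toy sketch. This is not the _ultra architecture: the HMAC-tag scheme, class, and key names below are illustrative simplifications. The client derives deterministic keyword tags under a shared key; the host matches tags without ever seeing keywords or plaintext.

```python
import hashlib
import hmac

def keyword_tag(key: bytes, word: str) -> bytes:
    # Deterministic tag: matchable by the host without revealing the keyword.
    return hmac.new(key, word.lower().encode(), hashlib.sha256).digest()

class ToyEncryptedIndex:
    """Host-side store: opaque ciphertexts plus keyword tags, no plaintext."""

    def __init__(self):
        self.entries = []  # list of (ciphertext, set_of_tags)

    def put(self, ciphertext: bytes, tags: set):
        self.entries.append((ciphertext, tags))

    def search(self, tag: bytes):
        # The host learns only which entries matched, never the query word.
        return [ct for ct, tags in self.entries if tag in tags]

# Client side: a key shared via some out-of-band public-key exchange.
key = b"shared-secret-from-key-exchange"
index = ToyEncryptedIndex()
index.put(b"<ciphertext-1>", {keyword_tag(key, "invoice"), keyword_tag(key, "2016")})
index.put(b"<ciphertext-2>", {keyword_tag(key, "contract")})

hits = index.search(keyword_tag(key, "invoice"))
```

Without the key, the host cannot forge or invert tags, so the query vocabulary stays private even though matching happens server-side.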

Gone, Not Forgotten.

June 16, 2016

Strong privacy laws that establish the 'right to be forgotten' may be unenforceable. EU citizens can request that search engines remove results that are no longer relevant or accurate; however, researchers at NYU have found that even after links are delisted it is possible to determine the names of individuals who petitioned for their removal.

Open Camps Conference at UN

July 12, 2016

The world's largest mission-driven open source conference, Open Camps aims to "break down barriers to technology innovation through open source governance, communities and collaboration." The Inpher team presented the _ultra development platform for application-level security and privacy at the Search Camp session in New York on July 10th.

Cloud Security by the Numbers

August 22, 2016

With over 3,000 IT professionals surveyed, the recent Ponemon study sponsored by Gemalto addressed issues concerning the "Global State of Cloud Data Security." The webcast can be viewed here and the report can be downloaded here. The participants represented a good cross-section of company sizes and geographic regions worldwide.

Confusion in China's Cyber Laws

November 8, 2016

The latest in a wave of sovereign data security laws has emerged from China, causing some alarm among companies trying to understand how it could impact their businesses. Several sectors are identified as "critical information infrastructure", including telecommunications, information services and finance; companies in these sectors would be required to store personal information and sensitive business data in China.

Behavioral Futures and Surveillance Capitalism

December 30, 2016

The inevitable onslaught of targeted advertisements has both consumers and technology companies wondering whether there is any alternative future for internet economics. Jonathan Shaw recently published a compelling piece in Harvard Magazine, breaking down some of the biggest challenges to our understanding of individual freedoms and technological progress.

Banking on [Digital] Trust

February 8, 2017

Trust cements the foundation of the banking industry. Without it, we would be more apt to keep cash stuffed under our mattresses than in the impenetrable vault of a stranger. Modern digital banking wins and maintains customers' trust based on the security, transparency and accessibility of their data. Unfortunately, those three qualities are not always mutually achievable.

US Senator Encourages Use of 'Privacy Enhancing Technologies'

May 23, 2017

In a letter to the Commission on Evidence-Based Policymaking, US Senator Ron Wyden, D-Ore. proposed the use of privacy enhancing technologies (PETs) by government agencies in order to protect sensitive data.

The Spectre of Hardware Security Looming over Intel SGX

January 16, 2018

As the fallout of the Spectre and Meltdown vulnerabilities settles, the future of in-silico security becomes fuzzier. There are many comprehensive reports on the attack vectors, patches and respective performance degradation, perhaps most lucidly presented by Peter Bright at Ars Technica.

Securing Autonomous Fleets with Global-Trained Localized Brains

March 18, 2018

It is possible to train machine learning models with private data sets so that no single data point is identified but statistical learning is maintained, including outliers like balls bouncing in the road or black ice conditions. Just like with humans, the more you experience, the more you know how to react in the future.

Predictions with Privacy for Patient Data

July 16, 2019

It has been proven that de-identified data points on anything from credit card transactions to healthcare records can be reidentified, often quickly, by trained data scientists with access to additional data points. A study conducted in 2000, for example, found that 87 percent of the U.S. population can be identified using a combination of their gender, birthdate and zip code.
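
The quasi-identifier effect is easy to demonstrate: counting how many records are unique on (gender, birthdate, ZIP) in even a tiny synthetic population shows how few attributes it takes to single people out. The records below are invented for illustration.

```python
from collections import Counter

# Tiny synthetic population: (gender, birthdate, ZIP) per record.
population = [
    ("F", "1975-03-02", "10001"),
    ("M", "1975-03-02", "10001"),
    ("F", "1980-07-19", "94110"),
    ("F", "1980-07-19", "94110"),
    ("M", "1962-11-30", "60614"),
]

counts = Counter(population)
# Records whose quasi-identifier combination is unique are re-identifiable
# by anyone holding an auxiliary dataset with the same three attributes.
unique = [rec for rec in population if counts[rec] == 1]
fraction_unique = len(unique) / len(population)
```

Here three of five records are unique on just three attributes; in real census-scale data the same arithmetic is what drives the 87 percent figure.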

An Alternative to Big Tech Breakup: Secure, Private Data-Sharing

February 7, 2020

Is there a good alternative to breaking up Big Tech? Yes, and the solution is secure, private data-sharing. With Secret Computing®, Big Tech companies could securely share their internal data with smaller competitors without ever sending or revealing their sensitive data in the process. This would effectively reduce Big Tech market power while maintaining the highest privacy standards.

A Call to Action: The Regulator’s Role in Supporting Privacy-Enhancing Technologies for Data-Driven Financial Crime Investigations

July 28, 2020

Combating financial crimes and protecting privacy are equally critical goals in modern society.

Financial crimes, when left undetected, can legitimize illegal profiting from systems of injustice on a global scale. Disregard for privacy can undermine an individual’s freedom to control how their personal information is used and disseminated. Risky data practices can further expose people to irreversible harms such as social profiling, identity theft, and data breaches.

Better ESG Benchmarking with Secret Computing®

October 20, 2020

As asset flows continue to pour into environmental, social and governance (ESG) factored investments, the problem of establishing reliable benchmarks persists. Investors and corporate managers now have the ability to incorporate PETs and Secret Computing® to generate better benchmarking that aligns sustained value with financial performance.

The Privacy Risk Right Under Our Nose in Federated Learning

February 23, 2021

Many consumers use phones and IoT devices that rely on federated learning (FL) every day without knowing what it is. The novelty of FL is thus shielding it from scrutiny even as inherent, new security vulnerabilities come to light. While FL is a useful technique for aggregating information held by millions of compute nodes, it needs an extra layer of privacy-enhancing technology to ensure there is no leakage of personal information through “model parameters” that expose sensitive inferences at the individual level.
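
The aggregation step at the heart of FL can be sketched in a few lines; the point is that even this simple averaging exposes each client's raw parameter update to the aggregator, which is exactly where an extra PET layer is needed. A toy sketch with hypothetical values:

```python
def federated_average(client_updates):
    """One FedAvg-style aggregation step: elementwise mean of client vectors.

    The privacy gap: the aggregator sees every client's raw update,
    from which individual-level information can be inferred.
    """
    n = len(client_updates)
    dim = len(client_updates[0])
    return [sum(u[i] for u in client_updates) / n for i in range(dim)]

# Hypothetical parameter updates reported by three devices.
clients = [[0.2, 1.0], [0.4, 1.2], [0.6, 0.8]]
global_model = federated_average(clients)
```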

EU and U.S. Policymakers Emphasize Privacy-Enhancing Technologies as a Shared Priority in 2021

February 26, 2021

Building Privacy-Preserving Decision Tree Models Using Multi-Party Computation

December 7, 2021

Machine learning (ML) is increasingly important in a wide range of applications, including market forecasting, service personalization, voice and facial recognition, autonomous driving, health diagnostics, education, and security analytics. Because ML touches so many aspects of our lives, it’s of vital concern that ML systems protect the privacy of the data used to train them, the confidential queries submitted to them, and the confidential predictions they return. Privacy protection — and the protection of organizations’ intellectual property — motivates the study of privacy-preserving machine learning (PPML). In essence, the goal of PPML is to perform machine learning in a manner that does not reveal any unnecessary information about training data sets, queries, and predictions. This article shows how to address privacy challenges and use PPML in XGBoost training and prediction.
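
One building block commonly used in MPC-based PPML systems is additive secret sharing. This minimal sketch is a generic textbook construction, not Inpher's implementation: each input is split into random-looking shares, and parties combine shares so that only the aggregate is ever reconstructed.

```python
import random

PRIME = 2**61 - 1  # all arithmetic happens in a prime field

def share(secret: int, n: int) -> list:
    """Split `secret` into n additive shares; any n-1 shares look random."""
    shares = [random.randrange(PRIME) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares: list) -> int:
    return sum(shares) % PRIME

# Two data owners (e.g. two hospitals) share private values among three parties.
a_shares = share(120, 3)
b_shares = share(45, 3)

# Each party adds the shares it holds; only the *sum* is ever reconstructed.
sum_shares = [(x + y) % PRIME for x, y in zip(a_shares, b_shares)]
total = reconstruct(sum_shares)
```

Addition of shares is what makes aggregate statistics (and, with more machinery, model training) possible without any party seeing another's raw input.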

Neuroimaging Based Diagnosis of Alzheimer's Disease Using Privacy-Preserving Machine Learning (Part 1)

December 21, 2021

In this first of a three-part blog series, we present a viable solution designed by clinical researchers at CHUV and Inpher aiming to build privacy-preserving neuroimaging-based ML models of AD using Inpher’s XOR Platform. We specifically show how data providers can securely collaborate to build linear and logistic regression models that clinical and non-clinical researchers can use. With the rising hopes for efficient disease-modifying drugs, our clinically relevant concept of accurate ML-based diagnosis will help clinicians to efficiently stratify patients for clinical trials and finally deliver better care.

Neuroimaging Based Diagnosis of Alzheimer's Disease Using Privacy-Preserving Machine Learning (Part 2)

February 21, 2022

This second installment of the three-part blog series describes two privacy-preserving machine learning models (linear and logistic regression) for detecting Alzheimer's disease. The models help researchers understand AD's underlying mechanism and help clinicians detect AD early on. We detail the workflow for training with data coming from multiple private sources (hospitals, radiology labs or research institutes), using a combination of SPM tools for local preprocessing and Inpher's XOR Platform for privacy-preserving machine learning.

Flying Fuzzy: A Privacy Preserved No-Fly List for Global Airlines Using Fuzzy String Matching

March 10, 2022

Since 2020, airlines have been dealing with a rash of irate, disruptive, and violent passengers, resulting in many of those travelers being banned from future flights. Yet each airline maintains its own no-fly list in a format unique to that carrier. This blog post shows how, with Inpher’s XOR Platform, the FAA and airlines can securely collaborate on identifying no-fly passengers while protecting travelers’ privacy and civil liberties.
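
The fuzzy-matching half of the problem can be illustrated with Python's standard library. This is a plaintext toy only (the post's point is running this comparison under MPC so the lists themselves stay private), and the names are invented:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Similarity ratio in [0, 1] between two names, case-insensitive."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Invented names; in the post's setting this list would stay secret under MPC.
no_fly = ["Jonathan Q. Smythe", "Maria Delgado-Ruiz"]

def is_flagged(passenger: str, threshold: float = 0.85) -> bool:
    # Fuzzy match tolerates typos and formatting differences across carriers.
    return any(similarity(passenger, name) >= threshold for name in no_fly)
```

A misspelled booking like "Jonathon Q Smythe" still matches above the threshold, while unrelated names fall well below it.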

How DataCo is Reinventing Data Partnerships by Putting Privacy-Enhancing Technologies to Work

June 5, 2022

Think of all the great customer experiences you have ever had! Those unexpected business class upgrades, or Netflix “on the house” as part of your cell phone plan. What’s behind them all? Seamless customer intelligence. One company helping organizations accomplish this is DataCo Technologies, a startup that emerged out of ANZ Bank’s VC and innovation arm, 1835i. The DataCo Collaboration Platform, built with Inpher’s Secret Computing® technology, allows organizations to connect and collaborate with data partners, leveraging the latest privacy-enhancing and data-protection technologies so that data never needs to be shared directly.

XOR Hybrid Private Computing in Microsoft Azure

June 21, 2023

With expanding privacy regulations, data transfer restrictions and sophisticated cyber attacks on the rise, organizations face unprecedented challenges to store, process and share sensitive data.

A Winning Go-to-Market Starts with Unification: From CMO to CRO

August 7, 2023

In today's hyper-competitive business landscape, the success of a company relies heavily on the seamless integration and alignment of various functions; in my case, that meant much-needed alignment between sales and marketing.

Empowering Women Leaders: The Impact of Women in High-Tech

August 15, 2023

As a junior professional, venturing into the high-tech space has been a truly unique and eye-opening journey filled with brilliant encounters.

Privacy Budget and the Data-Driven Enterprise

September 18, 2023

Recent developments in machine learning and artificial intelligence have highlighted the benefits data exploration can bring to organizations across industries.

Privacy Budget: A Roadmap to Privacy Preserving Data Collaboration

September 21, 2023

In the world of data collaboration, it is generally understood that when two or more parties share training data for their AI models, they achieve more accurate predictions while mitigating bias because they are learning from deeper and more diverse data points.

The Seven Foundational Principles of Privacy by Design

September 26, 2023

First introduced in 1995, privacy by design evolved alongside privacy-enhancing technologies (PETs) and is intended to address the broader systems and processes in which PETs operate.

Privacy Budget in Support of Privacy by Design

September 28, 2023

Of late, the concept of privacy by design has been formalized in data protection laws globally. Having evolved as a way to consider the broader systems and processes in which privacy-enhancing technologies (PETs) operate, privacy by design can be further supported by our concept of a privacy budget, which should be allocated in any data-driven project that uses PETs. In this blog, we explore the privacy budget in support of privacy by design for meeting global privacy regulations.
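
One common concrete realization of a privacy budget is differential privacy's epsilon accounting: each query spends some epsilon, and further queries are refused once the total is exhausted. A minimal sketch under that interpretation (class and function names are hypothetical):

```python
import random

class PrivacyBudget:
    """Track cumulative epsilon across queries (basic sequential composition)."""

    def __init__(self, total_epsilon: float):
        self.total = total_epsilon
        self.spent = 0.0

    def charge(self, epsilon: float):
        if self.spent + epsilon > self.total:
            raise RuntimeError("privacy budget exhausted")
        self.spent += epsilon

def noisy_count(true_count: int, epsilon: float, budget: PrivacyBudget) -> float:
    """Laplace mechanism for a counting query (sensitivity 1)."""
    budget.charge(epsilon)
    # The difference of two Exp(epsilon) draws is Laplace(0, 1/epsilon) noise.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

budget = PrivacyBudget(total_epsilon=1.0)
q1 = noisy_count(1200, epsilon=0.4, budget=budget)
q2 = noisy_count(830, epsilon=0.4, budget=budget)
# A third query at epsilon=0.4 would exceed the 1.0 budget and be refused.
```

Allocating the budget up front forces project teams to decide which analyses merit spending privacy, which is the design discipline the post advocates.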

Moore4Medical Accelerates Patient Monitoring for Improved Outcomes with Inpher

October 2, 2023

Moore4Medical is a European project led by Philips, comprising 65 partners, including universities, research institutes, hospitals, and private companies.

AI, Data and the Privacy Gap: Institutionalizing Governance within the Data Driven Enterprise

October 12, 2023

AI systems have an insatiable appetite for data. Rapid advancements mean they are now capable of ingesting and using almost all publicly available data, so increasingly, those working on a variety of AI applications are setting their sights on higher-value, more sensitive and private data assets.

Governance and Privacy-Enhancing Technologies: Why Every Enterprise Needs to Adopt a PET

October 16, 2023

Privacy-Enhancing Technologies (PETs) represent a revolutionary capability that facilitates a delicate equilibrium between privacy and utility within information systems.

Privacy-Preserving Model Explainability: What It Is & How Data Influences A Model

November 9, 2023

Explainable AI can help organizations align with ethical and regulatory imperatives in order to unleash the potential of their data, including private data.

The Rise of Financial Fraud in the Digital Era and the Role of Explainable AI

November 14, 2023

In a world of escalating financial fraud, Explainable AI has the potential to make powerful, ML-based fraud detection a realistic option that aligns with regulations, ethics, and the need for accuracy.

SHAP Values In Support of Forecasting

November 20, 2023

Given their ability to generate clear explanations while preserving accuracy, SHAP values have large potential to support Explainable AI. By bringing machine learning technologies to the mainstream, SHAP values can enable companies to more fully take advantage of their valuable data.
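
SHAP values are grounded in Shapley values from game theory: a feature's attribution is its average marginal contribution across all orderings in which features could be revealed. For a tiny model this can be computed exactly; the value function below is invented for illustration and is not any SHAP library's API.

```python
from itertools import permutations

def shapley_values(features, value_fn):
    """Exact Shapley values: average marginal contribution over all orderings."""
    phi = {f: 0.0 for f in features}
    orderings = list(permutations(features))
    for order in orderings:
        coalition = set()
        for f in order:
            before = value_fn(frozenset(coalition))
            coalition.add(f)
            phi[f] += value_fn(frozenset(coalition)) - before
    return {f: total / len(orderings) for f, total in phi.items()}

# Toy "model": predicted score as a function of which features are present.
def value(coalition):
    bonus = {"income": 4.0, "age": 2.0}
    return 10.0 + sum(bonus[f] for f in coalition)

phi = shapley_values(["income", "age"], value)
```

For this additive toy model each feature's Shapley value equals its own bonus, and the attributions sum exactly to the gap between the full prediction and the baseline, which is the "clear explanation while preserving accuracy" property the post highlights.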

Balancing Data, Privacy, and Business Learnings

November 21, 2023

For large retailers like Walmart, leveraging data strategically and in a privacy preserving way can produce powerful results and even more powerful actionable outcomes.

Inpher, Oracle & Sensitive Data Workloads: Generating Business Value from Stored Data

November 29, 2023

Customers rely on Oracle to manage their most sensitive data workloads. Inpher empowers organizations to generate value by operating collaboratively on the data found in Oracle databases.

Balancing Privacy and Explainable AI in Semiconductor Manufacturing

November 30, 2023

Business AI Preparedness - The Race to Leverage Secure AI

December 19, 2023

Businesses do not have to throw caution (and security and privacy) to the wind in their hurry to start benefiting from LLMs.

The Privacy-Utility Trade Off with Generative AI

January 22, 2024

Security technology must keep pace with the development of LLMs. A way out of the privacy-utility trade off is to secure users’ inputs.

Break Down Data Silos with Cryptographic Security Using Inpher on Oracle Cloud

January 23, 2024

Inpher’s integration with Oracle Cloud represents a significant advancement in secure and private cloud computing. This collaboration enhances data security, making it an ideal choice for organizations with data housed across multiple clouds and on-premises environments.

Advancing Risk Mitigation for Hedge Funds While Unlocking Investment Power via Generative AI

January 31, 2024

Accessing, sharing, and serving key learnings from datasets enables hedge funds to increase their productivity, predictability, and overall customer satisfaction.

GenAI in DevOps Community - The Privacy Utility Trade-off for Ethical and Secure Machine Learning Operations

February 6, 2024

In this blog post, we will explore the growing trend of how coders are leveraging GenAI, examine key usage statistics, and identify the coder communities at the forefront of this technological evolution, as well as the need for ethical and responsible AI in practice.

Insurance Fraud Detection: The Role of AI and Data Collaboration

February 13, 2024

In this post, we will discuss how new technologies can help in the fight against insurance fraud, and why data – and data privacy – are at the heart of these efforts.

Unlocking Privacy-Preserving AI: Transforming Auto Insurance Fraud Detection

February 20, 2024

In this post, we discuss how data collaboration and privacy-preserving AI can enhance auto insurance fraud detection.

Hedging your PETs - Hedge Fund Market Optimization via Privacy Preserving Technologies

February 27, 2024

In the evolving landscape of hedge funds, leveraging data with AI must go hand-in-hand with prioritizing security and privacy.

Inpher XOR DataFrame API - Advancing Privacy-Preserving Cryptographic Workflows

March 5, 2024

The XOR DataFrame API is a new way for data analysts to select input datasets, chain operations, and run them on the XOR Platform. The new API offers a leaner interface that is more intuitive to use.

Empowering Women: Advocating for Equality Beyond International Women's Day

March 8, 2024

On this International Women's Day, let's reflect on our past experiences and share our stories. Let's advocate for change, fostering an environment where women can speak freely and assertively.

Satellite Collision, Space Debris and PETs - How Privacy-Preserving Technologies can Advance Space Situational Awareness

March 12, 2024

In this blog post, we explore the global challenges of effectively sharing and analyzing space situational awareness (SSA) data for the purpose of satellite collision avoidance, and how Privacy-Enhancing Technologies (PETs) advance the sharing of sensitive data to help mitigate such risks.

The Ethical AI Revolution - How an AI Strategy Can Help with Data Utility and its Impact on Human Value (Part 1)

March 19, 2024

(Part 1 of 3) Ethical AI refers to the development and deployment of artificial intelligence systems that prioritize ethical considerations, such as fairness, transparency, accountability, privacy, and safety.

The Ethical AI Revolution - How an AI Strategy Can Help with Data Utility and its Impact on Human Value (Part 2)

March 26, 2024

(Part 2 of 3) Data utility in the context of ethical AI refers to the value or usefulness of data for achieving a specific goal while considering ethical principles and constraints.