Coin Explorers

News
Selling Sensitive Data on IOTA Data Marketplace


Oct 5, 2020 · 6 min read

Since the quarantine, while working on my graduation project, I discovered the IOTA Data Marketplace platform. And I wondered: why are data sources so scarce? Why do corporations and other institutions keep their data secret, even though it has been shown that sharing data for analysis by data scientists benefits everyone in society? Most importantly, how can we encourage these corporations to share their data? We need a way to gather sensitive data from different sources while preserving data integrity and without exposing information to unauthorized individuals.

The Problem

Institutions put a lot of effort and resources into gathering data, and at the same time they don't like to share it, because they fear competitors could use it to implement a counter-strategy against them. For example, factories won't share data about their products' supply chains or production lines. The same goes for hospitals: they prefer not to share patient data because it is private. To solve this problem, we need a way to share data anonymously and securely. If we overcome it, we will have a rich source of trusted data that helps in implementing public policies, and more.

The Solution

In this article, I will present two cases. In the first case, institutions work in the same field, so they have similar information and face the same challenges: healthcare, financial, and manufacturing institutions, for example. We filter their data to make it anonymous, then merge all the similar data together. In the second case, institutions work in a unique field and usually don't want to share their data at all. For them, we offer a set of operations that users can run on the data without ever accessing it.

First Case

We use a DataFilter to implement a filtering process that encapsulates/hides all information referring to a specific institution.
Then we run a merging process over all similar institutions' data.

Use Case Examples

For example, suppose three factories want to share their data anonymously. Each factory has its own sensors and IoT devices producing data. A Data Encapsulation Block gathers the data, encapsulates and merges it, and finally publishes it to the IOTA Data Marketplace, where data analysts can use it. The same diagram could be applied to hospitals or any other institutions in the same field. We can implement it on a smaller scale below.

Let's Implement A Practical Example

To elaborate on my idea, I made a simple GitHub gist and tried to keep it as simple as possible. We assume we have IoT devices inside an institution that deploy data to the Data Marketplace, and our prototype works as middleware between them, filtering the data. The IoT devices send JSON objects to our gist. Following the flowchart above, I am going to create a simple application that filters data, merges it, then pushes and downloads the filtered JSON object.

Let's Code

It took me only 30 minutes to make this example. I used the IOTA documentation to keep my code as clean and clear as possible. What am I going to create exactly? A function that takes a tail transaction, a category, and the data as parameters, something like this:

```javascript
// dataset to be filtered
const inputData = {
  'name': 'Patient X',
  'id': 'YYYYY',
  'age': 23,
  'institution_name': 'institution X',
  'institution_id': 'IDIDIDID',
  'Result': 'Important Info To save',
  'dates': 'Important dates to save them',
  'Medicines': 'Important medicines to take'
};

// 1. tail transaction to check for old data
// 2. data category
// 3. data to be filtered
main('DWAOSLKHKMC9AQKYAL9PQSISXNDAAJGYXIPYTZVZHQCZGZAIPTVRN9OFFFZIISYIQOTWQWPI9WHGJN999', 'hospitals', inputData);
```

We need to include our npm modules and initialize the IOTA node:

```javascript
// IOTA global
const Iota = require('@iota/core');
const Extract = require('@iota/extract-json');
const Converter = require('@iota/converter');

// Connect to a node
const iota = Iota.composeAPI({
  provider: 'https://nodes.devnet.iota.org:443'
});

const depth = 3;
const minimumWeightMagnitude = 9;

// Define a seed and an address.
// These do not need to belong to anyone or have IOTA tokens.
// They must only contain a maximum of 81 trytes
// or 90 trytes with a valid checksum
const address = 'HEQLOWORLDHELLOWORLDHELLOWORLDHELLOWORLDHELLOWORLDHELLOWORLDHELLOWORLDHELLOWOR99D';
const seed = 'PUEOTSEITFEVEWCWBTSIZM9NKRGJEIMXTULBACGFRQK9IMGICLBKW9TTEVSDQMGWKBXPVCBMMCXWMNPDX';
```

The filter-and-merge function is implemented in a naive way for this example. It always merges the filtered record into whatever old data exists (an empty object on the first run), so no record is ever dropped:

```javascript
const filterAndMerge = (oldDataSet, dataset) => {
  const filter = ['name', 'id', 'age', 'institution_name', 'institution_id'];
  for (let index = 0; index < filter.length; index++) {
    delete dataset[filter[index]];
  }
  // Merge the newly filtered record into the previously fetched data
  return Object.assign({}, oldDataSet, dataset);
};
```

The fetch and push-data functions are exactly as in the documentation; I just replaced .then() with await:

```javascript
// IOTA methods
const fetch = async (tailTransactionHash) => {
  // Get the transaction objects in the bundle
  const bundle = await iota.getBundle(tailTransactionHash);
  // Extract and parse the JSON messages from the transactions'
  // `signatureMessageFragment` fields
  const data = JSON.parse(Extract.extractJson(bundle));
  return data;
};

const pushData = async (dataCategory, filteredDataset) => {
  const message = JSON.stringify({ dataCategory: dataCategory, dataset: filteredDataset });
  // Convert the message to trytes
  const messageInTrytes = Converter.asciiToTrytes(message);
  // Define a zero-value transaction object
  // that sends the message to the address
  const transfers = [{
    value: 0,
    address: address,
    message: messageInTrytes
  }];
  // Create a bundle from the `transfers` array
  // and send the transaction to the node
  const trytes = await iota.prepareTransfers(seed, transfers);
  const bundle = await iota.sendTrytes(trytes, depth, minimumWeightMagnitude);
  return bundle[0].hash;
};
```

Finally, we are going to create the main function. Its steps are the same as in the flowchart above.
```javascript
const main = async (tailTransaction, category, data) => {
  // Check if this data category already exists
  const oldData = await fetch(tailTransaction);
  let newData;
  if (oldData.dataCategory === category) {
    newData = filterAndMerge(oldData, data);
  } else {
    newData = filterAndMerge({}, data);
  }
  console.log(newData);
  const hash = await pushData(category, newData);
  console.log(hash);
  return hash;
};
```

Now you should have something similar to this:

```javascript
// IOTA global
const Iota = require('@iota/core');
const Extract = require('@iota/extract-json');
const Converter = require('@iota/converter');

// Connect to a node
const iota = Iota.composeAPI({
  provider: 'https://nodes.devnet.iota.org:443'
});

const depth = 3;
const minimumWeightMagnitude = 9;

// Define a seed and an address.
// These do not need to belong to anyone or have IOTA tokens.
// They must only contain a maximum of 81 trytes
// or 90 trytes with a valid checksum
const address = 'HEQLOWORLDHELLOWORLDHELLOWORLDHELLOWORLDHELLOWORLDHELLOWORLDHELLOWORLDHELLOWOR99D';
const seed = 'PUEOTSEITFEVEWCWBTSIZM9NKRGJEIMXTULBACGFRQK9IMGICLBKW9TTEVSDQMGWKBXPVCBMMCXWMNPDX';

// IOTA methods
const fetch = async (tailTransactionHash) => {
  // Get the transaction objects in the bundle
  const bundle = await iota.getBundle(tailTransactionHash);
  // Extract and parse the JSON messages from the transactions'
  // `signatureMessageFragment` fields
  const data = JSON.parse(Extract.extractJson(bundle));
  return data;
};

const pushData = async (dataCategory, filteredDataset) => {
  const message = JSON.stringify({ dataCategory: dataCategory, dataset: filteredDataset });
  // Convert the message to trytes
  const messageInTrytes = Converter.asciiToTrytes(message);
  // Define a zero-value transaction object
  // that sends the message to the address
  const transfers = [{
    value: 0,
    address: address,
    message: messageInTrytes
  }];
  // Create a bundle from the `transfers` array
  // and send the transaction to the node
  const trytes = await iota.prepareTransfers(seed, transfers);
  const bundle = await iota.sendTrytes(trytes, depth, minimumWeightMagnitude);
  return bundle[0].hash;
};

const filterAndMerge = (oldDataSet, dataset) => {
  const filter = ['name', 'id', 'age', 'institution_name', 'institution_id'];
  for (let index = 0; index < filter.length; index++) {
    delete dataset[filter[index]];
  }
  // Merge the newly filtered record into the previously fetched data
  return Object.assign({}, oldDataSet, dataset);
};

const main = async (tailTransaction, category, data) => {
  // Check if this data category already exists
  const oldData = await fetch(tailTransaction);
  let newData;
  if (oldData.dataCategory === category) {
    newData = filterAndMerge(oldData, data);
  } else {
    newData = filterAndMerge({}, data);
  }
  console.log(newData);
  const hash = await pushData(category, newData);
  console.log(hash);
  return hash;
};

// dataset to be filtered
const inputData = {
  'name': 'Patient X',
  'id': 'YYYYY',
  'age': 23,
  'institution_name': 'institution X',
  'institution_id': 'IDIDIDID',
  'Result': 'Important Info To save',
  'dates': 'Important dates to save them',
  'Medicines': 'Important medicines to take'
};

main('DWAOSLKHKMC9AQKYAL9PQSISXNDAAJGYXIPYTZVZHQCZGZAIPTVRN9OFFFZIISYIQOTWQWPI9WHGJN999', 'hospitals', inputData);
```

Let's run the code, and we will get an output like this:

```
node index.js
{
  dataCategory: 'hospitals',
  dataset: '{pateientA:patientB,patientC:patientD}',
  Result: 'Important Info To save',
  dates: 'Important dates to save them',
  Medicines: 'Important medicines to take'
}
HEDNKESCWPFNVGJBE99XNRZHHSYTRNZYRZXSVWHLQASANSWVKONYRNB9N9HCMHFIZIARDGARMKUCZS999
```

Second Case

In my opinion, the second case is more interesting, so I am going to focus on it in a separate article. Some data may be useless, or may risk exposing the identity of the institution, if we run the DataFilter on it. So instead we use a BlackBox where the data is kept secured, and we implement a chosen set of operations inside the BlackBox to produce the output. It will look something like that.

What To Do Next?

I am planning to implement this project on a large scale: to create a plugin for the IOTA Data Marketplace that encourages data sharing in a secure way.
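The BlackBox idea from the second case can be made concrete in a few lines. This is my own minimal sketch, not code from the gist: `createBlackBox`, the record shape, and the whitelisted operations are names I invented for illustration. The raw records live inside a closure, and callers can only run approved aggregate operations, so results leave the box but data never does.

```javascript
// A minimal BlackBox sketch: confidential records are captured in a
// closure, and only a whitelisted set of aggregate operations can run.
const createBlackBox = (confidentialRecords) => {
  // Only these operations are exposed; raw records never leave the closure.
  const operations = {
    count: (records) => records.length,
    averageAge: (records) =>
      records.reduce((sum, r) => sum + r.age, 0) / records.length,
  };
  return {
    run: (operationName) => {
      if (!operations[operationName]) {
        throw new Error(`Operation not allowed: ${operationName}`);
      }
      return operations[operationName](confidentialRecords);
    },
  };
};

// Usage: the institution keeps the data; analysts get only the results.
const box = createBlackBox([
  { name: 'Patient X', age: 23 },
  { name: 'Patient Y', age: 31 },
]);
console.log(box.run('count'));      // 2
console.log(box.run('averageAge')); // 27
```

A real deployment would of course need the box to run in a trusted environment, but the interface shape stays the same: named operations in, aggregate results out.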
This is why I wrote this article: I want to know how many people are interested in data integrity. If I started an open-source project, would the community be interested in contributing? I have created a proposal for a full implementation; if anyone is interested, I can share it. The next steps will be:

  • I used to implement IPFS in all of my blockchain projects as decentralized storage. Recently, IOTA launched Proof of Existence (PoE) and I am planning to try it; it could be useful for verifying filtered data.
  • The filtering and merging processes should keep data anonymous while sorting datasets in a way that is easier to analyze. We may need some algorithms to achieve that.
  • Authentication and authorization for institutions.
  • Integration with the IOTA Data Marketplace.
  • An external layer of security should be added if institutions don't want to share their data at all, or if the datasets would not be useful once anonymized. In that case we upload a machine learning model, run the operations inside a BlackBox (with the confidential data inside it), and share only the results.

I welcome anybody interested in the open-source world to join me. If you are interested, you can contact me by email at [email protected]

