- 2017, August 22, Irish Times, “TDs fear new data protection rules will hamper constituency work”
- 2017, August 22, Hot For Security, “Cryptocurrency doesn’t guarantee financial privacy, researchers find”
- 2017, August 22, Cointelegraph, “Why Blockchain Alone Cannot Fix Privacy Issue” — indeed. No technology is totally autonomous. At some stage regulation and oversight are needed. Permissioned blockchains, where nodes are vetted and compliance is monitored, may work.
- 2017, August 21, IAPP, “Late out of the gate: Companies lagging on GDPR’s controller accommodation requirement”
- 2017, August 21, CloudFlare, “Advancing Privacy Protection with the GDPR”
- 2017, August 21, Inc, “The Most Important Cybersecurity News Story You Missed This Month”
- 2017, August 21, MIT, “Using machine learning to improve patient care”
- 2017, August 21, The Mandarin, “RBA eyes digital identity dividend” — so in a sense we end up with the Australia Card anyway!
- 2017, August 21, Signal, “Seeing Is Believing For Artificial Intelligence”
- 2017, August 21, ZDNet, “Gartner sets fire to all the cyber things”
- 2017, August 21, ITNews, “Fujitsu declares Sydney SAN crash a ‘major incident’” — you would think a power outage caused by a thunderstorm is precisely the kind of risk a major data centre would want to mitigate? Fujitsu are going to face painful compensation costs from all those large organisations paying big bucks for a DR site there.
- 2017, August 21, SSRN, “Building Sustainable Free Legal Advisory Systems: Experiences from the History of AI & Law” — lawyers will be comforted to learn how difficult it is to automate legal decision-making and expertise!
- 2017, August 21, SSRN, “Computer as Confidant: Digital Investment Advice and the Fiduciary Standard” — I guess the reality is that, given enough data, the computer will be able to comply with the ‘Know Your Customer’ rule! Maybe it will know the customer a little too well…
- 2017, August 21, SSRN, “Feeding the Machine: Policing, Crime Data, & Algorithms” — interesting perspective. This paper critiques algorithmic decision-making through a social constructionist lens: the models reflect how police construct the idea of ‘crime’, which is, more broadly, itself a social construction. It’s great to see humanities ideas integrated with computational ones.
- 2017, August 21, SSRN, “Insurers, Big Data, and Policyholder Statements” — nothing amazing in this paper. My impression is that it only says insurers can use big data to build a picture of a ‘normal’ policyholder, then use that analysis to detect fraud or address outliers (to improve pricing models, I guess).
- 2017, August 20, NZ Privacy Commissioner, “Benchmarking against international privacy peers”
- 2017, August 18, Oxford University, “Corporate Governance for Complex Cryptocurrencies”
- 2017, August 17, arXiv – 1708.05665, “Untangling Blockchain: A Data Processing View of Blockchain Systems” — I’m not sure why an academic paper needs to start with a sensationalist first line?! “Blockchain technologies are taking the world by storm, largely due to the success of Bitcoin.” I agree with the sentiment though. The paper quickly gets into the technical weeds after that flowery first paragraph.
Interesting Find: “On My Disk”: it looks like a user-friendly way to attach a high-volume disk (including one inside a home desktop computer) to your home network and open it up for remote use and file sharing. Not a bad idea, but good luck getting users to follow good security practice on their home network to avoid getting hacked!
The Forgotten (Data) Generation
I suspect that people will go overboard with requests to erase their personal data. Organisations and governments may want to extract as much analytic value as possible from their current datasets before May next year. I don’t think this mass erasure will be a long-term trend, but I think its impact will be noticeable.
A potential solution would be to pass more control to the individual to license their data — perhaps for a micro-payment (cue blockchain). This may not work, though: people will get lazy, and data will become outdated and unusable.
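To make the licensing idea concrete, here is a minimal, purely hypothetical sketch of what an individual data licence might look like as a record: a subject grants a named controller the right to use their data for one stated purpose, for a fee, until an expiry date. All names and fields here are illustrative assumptions, not any real scheme or API.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class DataLicense:
    """Hypothetical licence a data subject grants to a controller."""
    subject_id: str      # the individual granting the licence
    controller: str      # the organisation licensed to use the data
    purpose: str         # the single permitted use
    fee_cents: int       # micro-payment owed to the subject
    granted_at: datetime
    duration: timedelta

    def permits(self, use: str, at: datetime) -> bool:
        # Use is allowed only for the stated purpose, before expiry.
        return use == self.purpose and at < self.granted_at + self.duration

lic = DataLicense(
    subject_id="subject-42",
    controller="ExampleCorp",
    purpose="marketing analytics",
    fee_cents=5,
    granted_at=datetime(2017, 8, 22),
    duration=timedelta(days=90),
)
print(lic.permits("marketing analytics", datetime(2017, 9, 1)))   # True
print(lic.permits("marketing analytics", datetime(2018, 1, 1)))   # False: licence expired
print(lic.permits("credit scoring", datetime(2017, 9, 1)))        # False: wrong purpose
```

The point of the sketch is the "lazy users and stale data" problem noted above: every use has to be checked against purpose and expiry, and expired licences would need renewing (and re-paying), which is exactly the friction that might sink the idea.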
Incentives are needed both to protect data (on the controller’s and processor’s side) and to consent to its use (on the subject’s side). It may take some time and experimentation to get those incentives right. I don’t think it will be a market-led solution either.