slides/slides08.tex
changeset 142 14d4e839e13a
parent 141 12729536bfa2
child 143 5d6c0e3b4ebb

\begin{itemize}
\item Apple takes note of every dictation (sent over the Internet to Apple)
\item markets often only work if data is restricted (to build trust)
\item Social networks can reveal data about you
\item have you tried the collusion (lightbeam?) extension for Firefox?
\item I do use Dropbox, store cards
\end{itemize}

\begin{textblock}{5}(12,9.9)
\includegraphics[scale=0.2]{pics/gattaca.jpg}\\
to identify whether an individual was part of the study --- DB closed in 2008)
\end{itemize}}

\end{frame}}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\mode<presentation>{
\begin{frame}<2>[c]
\frametitle{We cannot exclude all Harm}

\begin{itemize}
\item Analysis of a given data set teaches us that smoking causes cancer.
Mary, a smoker, is harmed by this analysis: her insurance premiums rise.
Mary's premiums rise whether or not her data are in the data set. In other words,
Mary is harmed by the finding ``smoking causes cancer''.\bigskip

\item \ldots{}of course she is also helped: she might quit smoking
\end{itemize}

\end{frame}}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\mode<presentation>{
\begin{frame}<2>[c]
\frametitle{Differential Privacy}

\begin{itemize}
\item Goal: Nothing about an individual should be learnable from the database that
cannot be learned without access to the database.\pause\bigskip

\item Differential privacy is a ``protocol'' which you run on some dataset \bl{$X$} producing
some output \bl{$O(X)$}.\bigskip

\item You want to achieve \alert{forward privacy}
\end{itemize}

\end{frame}}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

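% The ``protocol'' on a dataset \bl{$X$} producing \bl{$O(X)$} can be made
% concrete with the classic Laplace mechanism for counting queries: add noise
% calibrated to the query's sensitivity, so the output distribution barely
% changes when any one individual is added to or removed from \bl{$X$}.
% A minimal Python sketch (the names laplace_noise and dp_count are
% illustrative, not from the slides):

```python
import math
import random

def laplace_noise(scale):
    # Inverse-CDF sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)

def dp_count(dataset, predicate, epsilon):
    # A counting query has sensitivity 1: adding or removing one person
    # changes the true count by at most 1.  Laplace noise with scale
    # 1/epsilon therefore gives epsilon-differential privacy.
    true_count = sum(1 for row in dataset if predicate(row))
    return true_count + laplace_noise(1.0 / epsilon)

# Example: a noisy answer to "how many smokers are in X?"
X = [{"smokes": True}, {"smokes": True}, {"smokes": False}]
noisy = dp_count(X, lambda row: row["smokes"], epsilon=0.5)
```

% Note the connection to the Mary example: the noisy count comes out almost
% the same whether or not Mary's record is in X, so nothing specific to her
% is revealed, even though population-level findings still emerge.
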
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\mode<presentation>{
\begin{frame}[t]
\frametitle{\begin{tabular}{@{}c@{}}Tor (private web browsing)\end{tabular}}

\begin{itemize}
\item initially developed by the US Naval Research Laboratory, but then opened up to the world
\item network of proxy nodes
\item a Tor client establishes a ``random'' path to the destination server (you cannot trace back where the information came from)\bigskip\pause
\end{itemize}}


\end{frame}}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
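% The layered (``onion'') encryption behind the client's random path can be
% sketched in Python.  The wrap/unwrap functions below are a toy stand-in for
% the real per-hop cryptography (Tor negotiates a symmetric key with each
% relay; the base64 framing here is only an illustration of the layering):

```python
import base64

def wrap(layer, node):
    # Toy stand-in for "encrypt this layer so only `node` can open it".
    return base64.b64encode(node.encode() + b"|" + layer)

def unwrap(blob, node):
    header, _, inner = base64.b64decode(blob).partition(b"|")
    assert header == node.encode(), "wrong node: cannot peel this layer"
    return inner

# The client builds the onion inside-out over its chosen path.
path = ["guard", "middle", "exit"]
onion = b"GET / HTTP/1.1"
for node in reversed(path):
    onion = wrap(onion, node)

# Each relay peels exactly one layer: the guard knows the client but not
# the destination, the exit knows the destination but not the client.
for node in path:
    onion = unwrap(onion, node)
```

% After the loop, only the exit node holds the plain request, which is why
% no single relay can link sender and receiver.
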
       

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\mode<presentation>{
\begin{frame}[c]
\frametitle{Tor Nodes}

Dan Egerstad wrote:\bigskip

\it ``If you actually look in to where these Tor nodes are hosted and how big they are, some of these nodes cost thousands of dollars each month just to host because they're using lots of bandwidth, they're heavy-duty servers and so on. Who would pay for this and be anonymous?''


\end{frame}}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%




%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\mode<presentation>{
\begin{frame}[t]
\frametitle{\begin{tabular}{@{}c@{}}Skype\end{tabular}}
\begin{frame}[c]
\frametitle{\begin{tabular}{@{}c@{}}Take Home Point\end{tabular}}

According to Ross Anderson: \bigskip
\begin{itemize}
       
\item Creating large databases of sensitive personal information is intrinsically
hazardous (NHS)\bigskip


\item Privacy in a big hospital is just about doable.\medskip
\item How do you enforce privacy in something as big as Google
or as complex as Facebook? Nobody knows.\bigskip

Similarly, big databases imposed by government