% handouts/ho01.tex
\documentclass{article}
\usepackage{../style}
\usepackage{../langs}

\lstset{language=JavaScript}

\begin{document}

\section*{Handout 1 (Security Engineering)}

itself. You have to look at everything backwards, upside down,
and sideways. You have to think like an alien.''
\end{quote}

\noindent In this module I would like to teach you this security
mindset. This might be a mindset that you think is very
foreign to you---after all, we are all good citizens and do not
hack into things. I beg to differ: you already had this mindset
when in school you were thinking, at least
hypothetically, about ways in which you could cheat in an exam
(whether by hiding notes or by looking over the
shoulders of your fellow pupils). Right? To defend a system,
you need to have this kind of mindset and be able to think like
an attacker. This includes understanding techniques that
can be used to compromise security and privacy in systems.
Many times this will result in insights where well-intended
security mechanisms actually made a system less
secure.\smallskip

{\Large\bf Warning!} However, don't be evil! Using those
techniques in the real world may violate the law or King's
rules, and it may be unethical. Under some circumstances, even
probing for weaknesses of a system may result in severe
responsibility. Ethics requires you to refrain from doing
harm. Always respect the privacy and rights of others. Do not
tamper with any of King's systems. If you try out a technique,
always make doubly sure you are working in a safe environment
so that you cannot cause any harm, not even accidentally.
Don't be evil. Be an ethical hacker.\medskip

\noindent
In this lecture I want to make you familiar with the security mindset
and dispel the myth that encryption is the answer to all security
problems (it is certainly often part of an answer, but almost
never a sufficient one). This is actually an important thread going
through the whole course: we will assume that encryption works
perfectly, but still attack ``things''. By ``works perfectly'' we mean
that we will assume encryption is a black box and, for example, will
not look at the underlying mathematics and break the
algorithms.\footnote{Though fascinating this might be.}

For a secure system, it seems, four requirements need to come
together: first a security policy (what is supposed to be
achieved?); second a mechanism (cipher, access controls,
tamper resistance, etc.); third the assurance we obtain from the
mechanism (the amount of reliance we can put on the mechanism);
and finally the incentives (the motive that the people
guarding and maintaining the system have to do their job
properly, and also the motive that the attackers have to try
to defeat your policy). The last point is often overlooked,
but plays an important role. To illustrate this, let's look at
an example.

The question is whether the Chip-and-PIN system with credit
cards is more secure than the older method of signing receipts
at the till. At first glance, Chip-and-PIN seems obviously
more secure and this was also the central plank in the
``marketing speak'' of the banks behind Chip-and-PIN. The
earlier system was based on a magnetic stripe or a mechanical
imprint on the card and required customers to sign receipts at
the till whenever they bought something. This signature
authorises the transaction. Although in use for a long time,
this system had some crucial security flaws, including making
clones of credit cards and forging signatures. Chip-and-PIN,
as the name suggests, relies on data being stored on
a chip on the card and a PIN number for authorisation.

       
Although the banks involved trumpeted their system as being
secure and indeed fraud rates initially went down, security
researchers were not convinced (especially the group around
Ross Anderson). To begin with, the Chip-and-PIN system
introduced a ``new player'' that needed to be trusted: the PIN
terminals and their manufacturers. Of course it was claimed
that these terminals are tamper-resistant, but needless to say
this was a weak link in the system, which criminals
successfully attacked. Some terminals were even so skilfully
manipulated that they transmitted PIN numbers via a built-in
mobile phone connection. To mitigate this security flaw, you
need to vet quite closely the supply chain of such
terminals---something that also needs to be done in other
industries.
       
Later on, Ross Anderson and his group managed to launch
man-in-the-middle attacks against Chip-and-PIN. Essentially
they made the terminal think the correct PIN was entered and
the card think that a signature was used. This flaw was
mitigated by requiring that a link between the card and the
bank is established every time the card is used. Even
later this group found another problem with Chip-and-PIN and
ATMs which do not generate random enough numbers (nonces)
on which the security of the underlying protocols relies.
       
The problem with all this is that the banks who introduced
Chip-and-PIN managed to shift the liability for any fraud and
the burden of proof onto the customer with the new system. In
the old system, the banks had to prove that the customer used
the card, which they often did not bother about. In effect, if
fraud occurred the customers were either refunded fully or
lost only a small amount of money. This
taking-responsibility-for-potential-fraud was part of the
``business plan'' of the banks and did not reduce their
profits too much. Since they successfully claimed that their
Chip-and-PIN system is secure, banks were able to point the
finger at the customer when fraud occurred: it must have been
the fault of the customer, who must have been negligent in
losing the PIN. The customer had almost no means to defend
themselves in such situations. That is why the work of
\emph{ethical} hackers like Ross Anderson's group was so
important: they and others established that the banks'
claim that their system is secure, and that it must have been the
customer's fault, was bogus. In 2009, for example, the law
changed so that the burden of proof of whether it was really
the customer who used a card shifted back to the banks.

       
It is a classic example where a security design principle was
violated: the one who is in the position to improve security
also needs to bear the financial losses if things go wrong.
Otherwise, you end up with an insecure system. In the case of the
Chip-and-PIN system, no good security engineer would actually
think that it is secure: the specification of the EMV protocol
(underlying Chip-and-PIN) is some 700 pages long, but still
leaves out many things (like how to implement a good random
number generator). Moreover, banks can add their own
sub-protocols to it. With all the experience we already have,
it was as clear as day that criminals would be able to poke holes
into it. With how the system was set up, the banks had no
incentive to come up with a system that is really secure.
Getting the incentives right in favour of security is often a
tricky business.

       
\subsection*{Of Cookies and Salts}
       
Let's look at another example which helps us to understand how
passwords should be verified and stored. Imagine you need to
develop a web-application that has the feature of recording
how many times a customer visits a page, for example to
give a discount whenever the customer has visited a webpage some
$x$ number of times (say $x$ equals $5$). For a number of years
the webpage of the New York Times operated in this way: it
allowed you to read ten articles per month for free; if
you wanted to read more you had to pay. There is one more
constraint: we want to store the information about the number
of times a customer has visited inside a cookie.

       
A typical web-application works as follows: the browser sends
a GET request for a particular page to a server. The server
answers this request. A simple JavaScript program that realises
a ``hello world'' webpage is as follows:

\begin{center}
\lstinputlisting{../progs/ap0.js}
\end{center}

       
\noindent The interesting lines are 4 to 7 where the answer
to the GET request is generated\ldots in this case it is just
a simple string. This program is run on the server and will
be run whenever a browser initiates such a GET request.

       
For our web-application, the feature of interest is that the
server, when answering the request, can store some information
on the client. This information is called a \emph{cookie}.
The next time the browser makes another GET request to the
same webpage, this cookie is sent along with the request and
can be read by the server. Therefore we can use a cookie in
order to store a counter recording the number of times a
webpage has been visited. This can be realised with the
following small program:

\begin{center}
\lstinputlisting{../progs/ap2.js}
\end{center}

       
\noindent The overall structure of this code is the same as
the earlier program: Lines 7 to 17 generate the answer to a
GET-request. The new part is in Line 8 where we read the
cookie called \pcode{counter}. If present, this cookie will be
sent together with the GET-request from the client. The value
of this counter will come in the form of a string, therefore we
use the function \pcode{parseInt} in order to transform it
into an integer. In case the cookie is not present, or has been
deleted, we default the counter to zero. The odd-looking
construction \code{...|| 0} realises this in JavaScript.
In Line 9 we increase the counter by one and store it back
on the client (under the name \pcode{counter}, since potentially
more than one value could be stored). In Lines 10 to 15 we
test whether this counter is greater than or equal to 5 and
send a message back to the client accordingly.
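The \code{...|| 0} idiom can be seen in isolation: \pcode{parseInt} yields \pcode{NaN} when its argument is absent or not a number, and \pcode{NaN || 0} evaluates to \pcode{0}. A tiny sketch:

```javascript
// The parseInt(...) || 0 idiom from the listing, in isolation:
// parseInt returns NaN for a missing or malformed cookie value,
// and NaN || 0 falls back to 0.
console.log(parseInt("3") || 0);         // 3
console.log(parseInt(undefined) || 0);   // 0  (cookie not present)
console.log(parseInt("gibberish") || 0); // 0  (cookie mangled)
```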
       
Let us step back and analyse this program from a security
perspective. We store a counter in plain text on the client's
browser (which is not under our control at all). Depending on
this value we want to unlock a resource (like a discount) when
it reaches a threshold. If the client deletes the cookie, then
the counter will just be reset to zero. This does not bother
us, because the purported discount will just be granted later.
This does not lose us any (hypothetical) money. What we need
to be concerned about is when a client artificially increases
this counter without having visited our web-page. This is
actually a trivial task for a knowledgeable person, since
there are convenient tools that allow us to set a cookie to an
arbitrary value, for example above our threshold for the
discount.
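To see how little the server-side check helps against such tampering, here is a sketch (the function name \pcode{respond} is made up; its logic mirrors the threshold check described above) showing that a forged cookie value sails straight past the test:

```javascript
// Sketch: the server-side threshold check extracted as a plain
// function (hypothetical name; logic mirrors the listing).
function respond(cookieValue) {
  const counter = (parseInt(cookieValue) || 0) + 1;
  return counter >= 5 ? "discount granted" : "no discount yet";
}

// An honest client on its second visit:
console.log(respond("1"));    // "no discount yet"
// A tampered cookie, set with browser dev tools to any value:
console.log(respond("9999")); // "discount granted"
```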
       
There is no real way to prevent this kind of tampering with
cookies, because the whole purpose of cookies is that they are
stored on the client's side, which from the server's
perspective is a potentially hostile environment. What we
need to ensure is the integrity of this counter in this
hostile environment. We could think of encrypting the counter.
But this has two drawbacks to do with the key for encryption.
If we use a `global' key for all the clients that visit our
site, then we risk that our whole ``business'' might collapse
when this key becomes known to the outside world. Suddenly all
cookies we might have set in the past can be manipulated.
If, on the other hand, we use a ``private'' key for every
client, then we have to solve the problem of securely
storing this key on our server side (obviously we
cannot store the key with the client, because then the client
again has all the data needed to tamper with the counter; and
obviously we also cannot encrypt the key, lest we face a
chicken-and-egg problem). So encryption does not seem to solve
the problem we face with the integrity of our counter.
       
% Note ....NYT
\end{document}

%%% Local Variables:
%%% mode: latex
%%% TeX-master: t