<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
    "http://www.w3.org/TR/1999/REC-html401-19991224/loose.dtd">
<html lang="en">
<head>
<title>Security</title>
<meta name="generator" content="BBEdit 6.1.2">
</head>
<body bgcolor="#FFFFFF">
<h2>Security</h2>
<p>
Many types of attacks are possible in public-participation
distributed computing.
</p>
<ul>
<li>
<b>Result falsification</b>. Attackers return incorrect results.
</li>
<li>
<b>Credit falsification</b>. Attackers return results claiming
more CPU time than was actually used.
</li>
<li>
<b>Malicious executable distribution</b>. Attackers break into a
BOINC server and, by modifying the database and files, attempt to
distribute their own executable (e.g. a virus program) disguised as a
BOINC application.
</li>
<li>
<b>Overrun of data server</b>. Attackers repeatedly send large
files to BOINC data servers, filling up their disks and rendering them
unusable.
</li>
<li>
<b>Theft of participant account information by server
attack</b>. Attackers break into a BOINC server and steal email
addresses and other account information.
</li>
<li>
<b>Theft of participant account information by network
attack</b>. Attackers exploit the BOINC network protocols to steal
account information.
</li>
<li>
<b>Theft of project files</b>. Attackers steal input and/or
output files.
</li>
<li>
<b>Intentional abuse of participant hosts by projects</b>. A
project intentionally releases an application that abuses participant
hosts, e.g. by stealing sensitive information stored in files.
</li>
<li>
<b>Accidental abuse of participant hosts by projects</b>. A
project releases an application that unintentionally abuses participant
hosts, e.g. deleting files or causing crashes.
</li>
</ul>
<p>
BOINC provides mechanisms to reduce the likelihood of some of these
attacks.
</p>
<p>
<b>Result falsification</b>
</p>
<p>
This can be probabilistically detected using redundant computing and
result verification: if a majority of results agree (according to an
application-specific comparison) then they are classified as correct.
Also, ringers or similar schemes can be used to prevent cheating. For
information about ringers see the paper "Uncheatable Distributed
Computations" by Philippe Golle and Ilya Mironov.
</p>
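<p>
For illustration only, here is a minimal sketch (in Python, not BOINC code)
of how a validator might look for a strict majority among redundant results.
The function and field names, and the equivalence test, are hypothetical
stand-ins for an application-specific comparison.
</p>
<pre>
# Illustrative sketch of majority-based result verification (not BOINC code).
# "results_equivalent" stands in for an application-specific comparison,
# e.g. numerical agreement within a tolerance.

def results_equivalent(a, b):
    # Placeholder: exact equality.  A real application would typically
    # tolerate small platform-dependent floating-point differences.
    return a == b

def find_canonical_result(results, min_quorum=3):
    """Return a result that a strict majority agrees with, or None."""
    if min_quorum > len(results):
        return None                      # too few results; wait for more
    for candidate in results:
        matches = [r for r in results if results_equivalent(candidate, r)]
        if len(matches) * 2 > len(results):
            return candidate             # majority agreement: call it correct
    return None                          # no majority; issue more copies
</pre>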
<p>
<b>Credit falsification</b>
</p>
<p>
This can be probabilistically detected using redundant computing and
credit verification: each participant is given the minimum credit from
among the correct results.
</p>
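<p>
The minimum-credit rule is simple enough to show directly; this is an
illustrative sketch with a hypothetical field name, not BOINC's actual code.
</p>
<pre>
# Illustrative sketch (hypothetical field name "claimed_credit"):
# every participant whose result was verified as correct is granted
# the minimum credit claimed among those correct results.
def granted_credit(correct_results):
    return min(r["claimed_credit"] for r in correct_results)
</pre>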
<p>
<b>Malicious executable distribution</b>
</p>
<p>
BOINC uses code signing to prevent this. Each project has a key pair
for code signing. The private key should be kept on a network-isolated
machine used for generating digital signatures for executables. The
public key is distributed to, and stored on, clients. All files
associated with application versions are sent with digital signatures
using this key pair.
</p>
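<p>
The following sketch illustrates the client-side check, assuming RSA
signatures over SHA-256, a PEM-encoded public key, and the Python
"cryptography" package. BOINC's actual implementation and file formats
differ; the key file name shown is an assumption.
</p>
<pre>
# Illustrative sketch of verifying a downloaded file against the project's
# code-signing public key.  Assumes RSA/PKCS#1 v1.5 signatures over SHA-256
# and a PEM key file named "code_sign_public.pem"; these are assumptions,
# not BOINC's actual formats.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

def load_code_sign_key(path="code_sign_public.pem"):
    with open(path, "rb") as f:
        return serialization.load_pem_public_key(f.read())

def file_is_authentic(public_key, file_path, signature):
    """True only if the signature over the file's contents was made with
    the project's code-signing private key."""
    with open(file_path, "rb") as f:
        data = f.read()
    try:
        public_key.verify(signature, data, padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False
</pre>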
<p>
Even if attackers break into a project's BOINC servers, they will
not be able to cause clients to accept a false code file.
</p>
<p>
BOINC provides a mechanism by which projects can periodically change
their code-signing key pair. The project generates a new key pair, then
(using the code-signing machine) generates a signature for the new
public key, signed with the old private key. The core client will accept
a new key only if it's signed with the old key. This mechanism is
designed to prevent attackers from breaking into a BOINC server and
distributing a false key pair.
</p>
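<p>
The key-rollover rule can be sketched the same way: the client accepts a new
public key only if the signature over the new key file verifies under the key
it already trusts. This is an illustration under the same assumptions as the
previous sketch, not BOINC's actual code.
</p>
<pre>
# Illustrative sketch of the key-rollover rule: a new code-signing public
# key is accepted only if the new key file carries a valid signature made
# with the old (currently trusted) private key.  Same assumptions as above.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

def accept_new_signing_key(old_public_key, new_key_pem, signature):
    """Return the public key the client should trust from now on."""
    try:
        old_public_key.verify(signature, new_key_pem,
                              padding.PKCS1v15(), hashes.SHA256())
    except InvalidSignature:
        return old_public_key            # not signed by the trusted key: reject
    return serialization.load_pem_public_key(new_key_pem)   # switch keys
</pre>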
<p>
<b>Overrun of data server</b>
</p>
<p>
Each result file has an associated maximum size. Each project has an
<b>upload authentication key pair</b>. The public key is stored on the
project's data servers. Result file descriptions are sent to clients
with a digital signature, which is forwarded to the data server when the
file is uploaded. The data server verifies the file description, and
ensures that the amount of data uploaded does not exceed the maximum
size.
</p>
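<p>
A data server's upload check might look roughly like the sketch below. The
signature scheme, field names, and storage path are assumptions made for
illustration; they are not BOINC's actual upload protocol.
</p>
<pre>
# Illustrative sketch of a data server's upload check.  The signed file
# description, its fields, and the storage path are assumptions made for
# illustration, not BOINC's actual upload protocol.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding

def handle_upload(upload_public_key, description, signature, uploaded_bytes,
                  max_nbytes):
    # "description" is the text the project signed; "max_nbytes" is the
    # maximum size taken from that (verified) description.
    try:
        upload_public_key.verify(signature, description.encode(),
                                 padding.PKCS1v15(), hashes.SHA256())
    except InvalidSignature:
        return "rejected: bad signature on file description"
    if len(uploaded_bytes) > max_nbytes:
        return "rejected: upload exceeds the signed maximum size"
    with open("upload/result_file", "wb") as f:      # path is illustrative
        f.write(uploaded_bytes)
    return "accepted"
</pre>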
<p>
<b>Theft of participant account information by server attack</b>
</p>
<p>
Each project must address this using conventional security
practices. All server machines should be protected by a firewall, and
should have all unused network services disabled. Access to these
machines should be done only with encrypted protocols like SSH. The
machines should be subjected to regular security audits.
</p>
<p>
Projects should be undertaken only by organizations that have
sufficient expertise and resources to secure their servers. A successful
attack could discredit all BOINC-based projects, and
public-participation computing in general.
</p>
<p>
<b>Theft of participant account information by network attack</b>
</p>
<p>
The
</p>
<p>
<b>Theft of project files</b>
</p>
<p>
The input and output files used by BOINC applications are not
encrypted. Applications can do this themselves, but it has little effect
since data resides in cleartext in memory, where it is easy to access
with a debugger.
</p>
<p>
<b>Intentional abuse of participant hosts by projects</b>
</p>
<p>
BOINC does nothing to prevent this (e.g. there is no "sandboxing" of
applications). Participants must understand that when they join a BOINC
project, they are entrusting the security of their systems to that
project.
</p>
<p>
<b>Accidental abuse of participant hosts by projects</b>
</p>
<p>
BOINC does nothing to prevent this. The chances of it happening can
be minimized by pre-release application testing. Projects should test
their applications thoroughly on all platforms and with all input data
scenarios before promoting them to production status.
</p>
</body>
</html>