
Friday 1 May 2015

SECURITY: COMPUTER SYSTEM

Computer security, also known as cybersecurity or IT security, is security applied to computing devices such as computers and smartphones, as well as to computer networks, both private and public, including the whole Internet. The field includes all the processes and mechanisms by which digital equipment, information, and services are protected from unintended or unauthorized access, change, or destruction, and is of growing importance due to the increasing reliance on computer systems in most societies. It includes physical security, to prevent theft of equipment, and information security, to protect the data on that equipment. The terms cybersecurity and IT security generally do not refer to physical security, but a common belief among computer security experts is that a physical security breach is one of the worst kinds of security breach, as it generally allows full access to both data and equipment.

Cybersecurity is the process of applying security measures to ensure confidentiality, integrity, and availability of data. Cybersecurity attempts to assure the protection of assets, which includes data, desktops, servers, buildings, and most importantly, humans. The goal of cybersecurity is to protect data both in transit and at rest. Countermeasures can be put in place in order to increase the security of data. Some of these measures include, but are not limited to, access control, awareness training, audit and accountability, risk assessment, penetration testing, vulnerability management, and security assessment and authorization.

Contents 

1 Vulnerabilities
1.1 Backdoors
1.2 Denial-of-service attack
1.3 Direct-access attacks
1.4 Eavesdropping
1.5 Spoofing
1.6 Tampering
1.7 Repudiation
1.8 Information disclosure
1.9 Privilege escalation
1.10 Exploits
1.11 Social engineering and trojans
1.12 Indirect attacks
1.13 Computer crime
2 Vulnerable areas
2.1 Financial systems
2.2 Utilities and industrial equipment
2.3 Aviation
2.4 Consumer devices
2.5 Large corporations
2.6 Automobiles
2.7 Government
3 Financial cost of security breaches
3.1 Reasons
4 Computer protection (countermeasures)
4.1 Security and systems design
4.2 Security measures
4.2.1 Difficulty with response
4.3 Reducing vulnerabilities
4.4 Security by design
4.5 Security architecture
4.6 Hardware protection mechanisms
4.7 Secure operating systems
4.8 Secure coding
4.9 Capabilities and access control lists
4.10 Hacking back
5 Notable computer security attacks and breaches
5.1 Robert Morris and the first computer worm
5.2 Rome Laboratory
5.3 TJX loses 45.7m customer credit card details
5.4 Stuxnet attack
5.5 Global surveillance disclosures
5.6 Target And Home Depot Breaches by Rescator
6 Legal issues and global regulation
7 Government
7.1 Public–private cooperation
8 Actions and teams in the US
8.1 Cybersecurity Act of 2010
8.2 International Cybercrime Reporting and Cooperation Act
8.3 Protecting Cyberspace as a National Asset Act of 2010
8.4 White House proposes cybersecurity legislation
8.5 White House Cybersecurity Summit
8.6 Government initiatives
8.7 Military agencies
8.7.1 Homeland Security
8.7.2 FBI
8.7.3 Department of Justice
8.7.4 USCYBERCOM
8.8 FCC
8.9 Computer Emergency Readiness Team
9 International actions
9.1 Germany
9.1.1 Berlin starts National Cyber Defense Initiative
9.2 South Korea
9.3 India
9.4 Canada
10 National teams
10.1 Europe
10.2 Other countries
11 Cybersecurity and modern warfare
12 The cyber security job market
13 Terminology
14 Scholars

Vulnerabilities

A vulnerability is a weakness which allows an attacker to reduce a system's information assurance. Vulnerability is the intersection of three elements: a system susceptibility or flaw, attacker access to the flaw, and attacker capability to exploit the flaw. To exploit a vulnerability, an attacker must have at least one applicable tool or technique that can connect to a system weakness. In this frame, vulnerability is also known as the attack surface.

A large number of vulnerabilities are documented in the Common Vulnerabilities and Exposures (CVE) database at www.cve.org.

Vulnerability management is the cyclical practice of identifying, classifying, remediating, and mitigating vulnerabilities. This practice generally refers to software vulnerabilities in computing systems.
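
The identify/classify/remediate/mitigate cycle can be sketched as a small state machine; the CVE identifier and severity labels below are purely illustrative:

```python
from dataclasses import dataclass
from enum import Enum

class Stage(Enum):
    IDENTIFIED = "identified"
    CLASSIFIED = "classified"
    REMEDIATED = "remediated"
    MITIGATED = "mitigated"

@dataclass
class Vulnerability:
    cve_id: str
    severity: str = "unknown"
    stage: Stage = Stage.IDENTIFIED

    def classify(self, severity: str) -> None:
        # Assign a severity rating and advance the lifecycle.
        self.severity = severity
        self.stage = Stage.CLASSIFIED

    def remediate(self) -> None:
        # A patch or configuration change removes the flaw.
        self.stage = Stage.REMEDIATED

# Walk one (illustrative) finding through the cycle.
v = Vulnerability("CVE-2014-0160")
v.classify("high")
v.remediate()
```

In practice each stage would be backed by scanner output, a tracking system, and change management rather than a single object in memory.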

A security risk may be classified as a vulnerability, but using vulnerability with the same meaning as risk can lead to confusion. Risk is tied to the potential of a significant loss, and there can be vulnerabilities without risk, for example when the affected asset has no value. A vulnerability with one or more known (publicly or privately) instances of working, fully implemented attacks is classified as an exploitable vulnerability: a vulnerability for which an exploit exists. To exploit such vulnerabilities, perpetrators (an individual hacker, a criminal organization, or a nation state) most commonly use malware (malicious software) such as worms and viruses, as well as targeted attacks.

Different scales exist to assess the risk of an attack. In the United States, authorities use the Information Operations Condition (INFOCON) system. This system is scaled from 5 to 1 (INFOCON 5 being a harmless situation and INFOCON 1 representing the most critical threats).

To understand the techniques for securing a computer system, it is important to first understand the various types of "attacks" that can be made against it. These threats can typically be classified into one of the categories in the section below.

Backdoors

A backdoor in a computer system, a cryptosystem, or an algorithm is a method of bypassing normal authentication, securing remote access to a computer, obtaining access to plaintext, and so on, while attempting to remain undetected. A special form of asymmetric-encryption attack, known as a kleptographic attack, resists being useful to a reverse engineer even after it is detected and analyzed.

The backdoor may take the form of an installed program (e.g., Back Orifice), or could be a modification to an existing program or hardware device. A specific form of backdoor is a rootkit, which replaces system binaries and/or hooks into the function calls of an operating system to hide the presence of other programs, users, services and open ports. It may also fake information about disk and memory usage.
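
Because a rootkit typically works by replacing system binaries, one classic countermeasure is to record cryptographic hashes of known-good binaries and compare them later. A minimal sketch (the file path and contents are illustrative, not real binaries):

```python
import hashlib

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Baseline hashes recorded while the system is known to be clean.
baseline = {"/bin/ls": sha256_of(b"original ls binary")}

def check_integrity(path: str, current_contents: bytes) -> bool:
    """Return True if the file still matches its recorded baseline hash."""
    return baseline.get(path) == sha256_of(current_contents)

unmodified = check_integrity("/bin/ls", b"original ls binary")   # True
replaced = check_integrity("/bin/ls", b"trojaned ls binary")     # False
```

Real integrity checkers (e.g., Tripwire-style tools) must also protect the baseline itself, since a rootkit with full control could rewrite it.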

Denial-of-service attack

Unlike other exploits, denial of service attacks are not used to gain unauthorized access or control of a system. They are instead designed to render it unusable. Attackers can deny service to individual victims, such as by deliberately entering a wrong password enough consecutive times to cause the victim account to be locked, or they may overload the capabilities of a machine or network and block all users at once. These types of attack are, in practice, difficult to prevent, because the behaviour of whole networks needs to be analyzed, not just the behaviour of small pieces of code. Distributed denial of service (DDoS) attacks, where a large number of compromised hosts (commonly referred to as "zombie computers", used as part of a botnet with, for example, a worm, trojan horse, or backdoor exploit to control them) are used to flood a target system with network requests, thus attempting to render it unusable through resource exhaustion, are common. Another technique to exhaust victim resources is through the use of an attack amplifier, where the attacker takes advantage of poorly designed protocols on third-party machines, such as NTP or DNS, in order to instruct these hosts to launch the flood. Some vulnerabilities in applications or operating systems can be exploited to make the computer or application malfunction or crash to create a denial-of-service.
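
The account-lockout behaviour that attackers abuse here can be illustrated with a toy lockout policy; the threshold and interface are assumptions, not any particular system's:

```python
from collections import defaultdict

MAX_FAILURES = 5  # illustrative threshold

class LoginGuard:
    """Locks an account after too many consecutive failed attempts."""
    def __init__(self):
        self.failures = defaultdict(int)
        self.locked = set()

    def attempt(self, user: str, password_ok: bool) -> str:
        if user in self.locked:
            return "locked"
        if password_ok:
            self.failures[user] = 0   # success resets the counter
            return "ok"
        self.failures[user] += 1
        if self.failures[user] >= MAX_FAILURES:
            self.locked.add(user)     # the victim is now denied service
            return "locked"
        return "denied"

guard = LoginGuard()
for _ in range(MAX_FAILURES):
    result = guard.attempt("alice", password_ok=False)
# Even the legitimate user with the right password is now shut out.
```

This is exactly the trade-off the paragraph describes: the lockout protects against password guessing but hands an attacker a trivial denial-of-service lever.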

Direct-access attacks

Common consumer devices that can be used to transfer data surreptitiously.
An unauthorized user gaining physical access to a computer (or part thereof) can perform many functions or install different types of devices to compromise security, including operating system modifications, software worms, keyloggers, and covert listening devices. The attacker can also easily download large quantities of data onto backup media such as CD-R/DVD-R, or portable devices such as flash drives, digital cameras, or digital audio players. Another common technique is to boot an operating system contained on a CD-ROM or other bootable media and read the data from the hard drive(s) that way. The only way to prevent this is to encrypt the storage media and store the key separately from the system. In most cases, direct-access attacks are the only type of threat to air-gapped computers.
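
Encrypting storage while keeping the key off the machine usually means deriving the key from a passphrase supplied at unlock time. A minimal key-derivation sketch using Python's standard library (the iteration count and passphrase are illustrative):

```python
import hashlib
import os

def derive_key(passphrase: bytes, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256 stretches a passphrase into a 32-byte key;
    # the iteration count (here 200,000) slows brute-force guessing.
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000)

salt = os.urandom(16)                      # stored alongside the ciphertext
key = derive_key(b"correct horse battery staple", salt)

# The same passphrase and salt always reproduce the key, so the key
# itself never needs to be written to the protected disk.
rederived = derive_key(b"correct horse battery staple", salt)
```

Real full-disk encryption (LUKS, BitLocker, FileVault) layers key slots and hardware support on top of this idea, but the derive-don't-store principle is the same.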

Eavesdropping

Eavesdropping is the act of surreptitiously listening to a private conversation, typically between hosts on a network. For instance, programs such as Carnivore and NarusInsight have been used by the FBI and NSA to eavesdrop on the systems of internet service providers. Even machines that operate as a closed system (i.e., with no contact to the outside world) can be eavesdropped upon by monitoring the faint electromagnetic transmissions generated by the hardware; TEMPEST is an NSA specification referring to these attacks.

Spoofing

Spoofing of user identity describes a situation in which one person or program successfully masquerades as another by falsifying data.

Tampering

Tampering describes an intentional modification of products in a way that would make them harmful to the consumer.

Repudiation

Repudiation describes a situation where the authenticity of a signature is being challenged.

Information disclosure

Information disclosure (privacy breach or data leak) describes a situation where information, thought to be secure, is released in an untrusted environment.

Privilege escalation

Privilege escalation describes a situation where an attacker gains elevated privileges or access to resources that are normally restricted from them.

Exploits

An exploit is a software tool designed to take advantage of a flaw in a computer system. This frequently includes gaining control of a computer system, allowing privilege escalation, or mounting a denial-of-service attack. The code from exploits is frequently reused in trojan horses and computer viruses. In some cases, a vulnerability can lie in certain programs' processing of a specific file type, such as a non-executable media file. Some security web sites maintain lists of currently known unpatched vulnerabilities found in common programs.

Social engineering and trojans

A computer system is no more secure than the persons responsible for its operation. Malicious individuals have regularly penetrated well-designed, secure computer systems by taking advantage of the carelessness of trusted individuals, or by deliberately deceiving them, for example sending messages that they are the system administrator and asking for passwords. This deception is known as social engineering.

In the world of information technology there are different types of cyber attack, such as code injection into a website or the use of malware (malicious software) such as viruses and trojans. Attacks of these kinds are counteracted by managing or repairing the damaged product. But there is one further type, social engineering, which does not directly target computers but rather their users, who are often called "the weakest link". This type of attack can achieve results similar to other classes of cyber attack by going around the infrastructure built to resist malicious software; because it is more difficult to anticipate or prevent, it is often a more efficient attack vector.

The main goal is to convince the user, through psychological manipulation, to disclose secrets such as passwords or card numbers by, for example, impersonating a bank, a contractor, or a customer.

Indirect attacks

An indirect attack is an attack launched by a third-party computer. By using someone else's computer to launch an attack, it becomes far more difficult to track down the actual attacker. There have also been cases where attackers took advantage of public anonymizing systems, such as the Tor onion router system.

Computer crime

Computer crime refers to any crime that involves a computer and a network.

Vulnerable areas

Computer security is critical in almost any industry which uses computers.

Financial systems

Web sites that accept or store credit card numbers and bank account information are prominent hacking targets, because of the potential for immediate financial gain from transferring money, making purchases, or selling the information on the black market. In-store payment systems and ATMs have also been tampered with in order to gather customer account data and PINs.

Utilities and industrial equipment

Computers control functions at many utilities, including coordination of telecommunications, the power grid, nuclear power plants, and valve opening and closing in water and gas networks. The Internet is a potential attack vector for such machines if connected, but the Stuxnet worm demonstrated that even equipment controlled by computers not connected to the Internet can be vulnerable to physical damage caused by malicious commands sent to industrial equipment (in that case uranium enrichment centrifuges) which are infected via removable media. In 2014, the Computer Emergency Readiness Team, a division of the Department of Homeland Security, investigated 79 hacking incidents at energy companies.

Aviation

The aviation industry is especially important when analyzing computer security because the involved risks include human life, expensive equipment, cargo, and transportation infrastructure. Security can be compromised by hardware and software malpractice, human error, and faulty operating environments. Threats that exploit computer vulnerabilities can stem from sabotage, espionage, industrial competition, terrorist attack, mechanical malfunction, and human error.

The consequences of a successful deliberate or inadvertent misuse of a computer system in the aviation industry range from loss of confidentiality to loss of system integrity, which may lead to more serious concerns such as exfiltration (data theft or loss) and network and air traffic control outages, which in turn can lead to airport closures, loss of aircraft, and loss of passenger life. Military systems that control munitions can pose an even greater risk.

An effective attack does not need to be very high-tech or well funded; a power outage at an airport alone can cause repercussions worldwide. One of the easiest, and arguably most difficult to trace, security vulnerabilities is achievable by transmitting unauthorized communications over specific radio frequencies. These transmissions may spoof air traffic controllers or simply disrupt communications altogether. Controlling aircraft over oceans is especially dangerous because radar surveillance extends only 175 to 225 miles offshore; beyond the radar's sight, controllers must rely on periodic radio communications with a third party. Another attack vector of concern is onboard wifi systems.

Consumer devices

Desktop computers and laptops are commonly infected with malware, either to gather passwords or financial account information, or to construct a botnet to attack another target. Smart phones, tablet computers, smart watches, and other mobile devices have also recently become targets for malware.

Many smartphones have cameras, microphones, GPS receivers, compasses, and accelerometers. Many Quantified Self devices, such as activity trackers, and mobile apps collect personal information, such as heartbeat, diet, notes on activities (from exercise in public to sexual activities), and performance of bodily functions. Wifi, Bluetooth, and cell phone network devices can be used as attack vectors, and sensors might be remotely activated after a successful attack. Many mobile applications do not use encryption to transmit this data, nor to protect usernames and passwords, leaving the devices and the web sites where data is stored vulnerable to monitoring and break-ins.

Hacking techniques have also been demonstrated against home automation devices such as the Nest thermostat.

Large corporations

Data breaches at large corporations have become common, largely for financial gain through identity theft. Notably, the 2014 Sony Pictures Entertainment hack was allegedly carried out by the government of North Korea or its supporters, in retaliation for an unflattering caricature and fictional assassination of supreme leader Kim Jong-un.

Automobiles

With physical access to a car's internal controller area network, hackers have demonstrated the ability to disable the brakes and turn the steering wheel. Computerized engine timing, cruise control, anti-lock brakes, seat belt tensioners, door locks, airbags, and advanced driver assistance systems make these disruptions possible, and self-driving cars go even further. Connected cars may use wifi and bluetooth to communicate with onboard consumer devices, and the cell phone network to contact concierge and emergency assistance services or get navigational or entertainment information; each of these networks is a potential entry point for malware or an attacker. Researchers in 2011 were even able to use a malicious compact disc in a car's stereo system as a successful attack vector, and cars with built-in voice recognition or remote assistance features have onboard microphones which could be used for eavesdropping. A 2015 report by U.S. Senator Edward Markey criticized manufacturers' security measures as inadequate and also highlighted privacy concerns about driving, location, and diagnostic data being collected, which is vulnerable to abuse by both manufacturers and hackers.

Government

Military installations have been the target of hacks; vital government infrastructure such as traffic light controls, police and intelligence agency communications, and financial systems are also potential targets as they become computerized.

Financial cost of security breaches

Serious financial damage has been caused by security breaches, but because there is no standard model for estimating the cost of an incident, the only data available is that which is made public by the organizations involved. “Several computer security consulting firms produce estimates of total worldwide losses attributable to virus and worm attacks and to hostile digital acts in general. The 2003 loss estimates by these firms range from $13 billion (worms and viruses only) to $226 billion (for all forms of covert attacks). The reliability of these estimates is often challenged; the underlying methodology is basically anecdotal.”

Insecurities in operating systems have led to a massive black market for rogue software. An attacker can use a security hole to install software that tricks the user into buying a product. At that point, an affiliate program pays the affiliate responsible for generating that installation about $30. The software is sold for between $50 and $75 per license.

Reasons

There are many similarities (yet many fundamental differences) between computer and physical security. Just like real-world security, the motivations for breaches of computer security vary between attackers, sometimes called hackers or crackers. Some are thrill-seekers or vandals (the kind often responsible for defacing web sites); similarly, some web site defacements are done to make political statements. However, some attackers are highly skilled and motivated with the goal of compromising computers for financial gain or espionage. An example of the latter is Markus Hess (more diligent than skilled), who spied for the KGB and was ultimately caught because of the efforts of Clifford Stoll, who wrote a memoir, The Cuckoo's Egg, about his experiences.

For those seeking to prevent security breaches, the first step is usually to attempt to identify what might motivate an attack on the system, how much the continued operation and information security of the system are worth, and who might be motivated to breach it. The precautions required for a home personal computer are very different from those for a bank's Internet banking system, and different again from those for a classified military network. Other computer security writers suggest that, since an attacker using a network need know nothing about you or what you have on your computer, attacker motivation is inherently impossible to determine beyond guessing. If true, blocking all possible attacks is the only plausible action to take.

Computer protection (countermeasures)

There are numerous ways to protect computers, including utilizing security-aware design techniques, building on secure operating systems and installing hardware devices designed to protect the computer systems.

In general, a countermeasure is a measure or action taken to counter or offset another one. In computer security a countermeasure is defined as an action, device, procedure, or technique that reduces a threat, a vulnerability, or an attack by eliminating or preventing it, by minimizing the harm it can cause, or by discovering and reporting it so that corrective action can be taken. An alternate meaning of countermeasure from the InfosecToday glossary is:

The deployment of a set of security services to protect against a security threat.

Security and systems design

Although there are many aspects to take into consideration when designing a computer system, security can prove to be very important. According to Symantec, in 2010, 94 percent of organizations polled expected to implement security improvements to their computer systems, and 42 percent ranked cyber security as their top risk.

At the same time, many organizations are improving security, and many types of cyber criminals are finding ways to continue their activities. Almost every type of cyber attack is on the rise. In 2009, respondents to the CSI Computer Crime and Security Survey reported that malware infections, denial-of-service attacks, password sniffing, and web site defacements were significantly higher than in the previous two years.

Security measures

A state of computer "security" is the conceptual ideal, attained by the use of three processes: threat prevention, detection, and response. These processes are based on various policies and system components, which include the following:

User account access controls and cryptography can protect systems files and data, respectively.
Firewalls are by far the most common prevention systems from a network security perspective as they can (if properly configured) shield access to internal network services, and block certain kinds of attacks through packet filtering. Firewalls can be either hardware- or software-based.
Intrusion Detection Systems (IDSs) are designed to detect network attacks in progress and assist in post-attack forensics, while audit trails and logs serve a similar function for individual systems.
"Response" is necessarily defined by the assessed security requirements of an individual system and may cover the range from simple upgrade of protections to notification of legal authorities, counter-attacks, and the like. In some special cases, a complete destruction of the compromised system is favored, as it may happen that not all the compromised resources are detected.
Today, computer security comprises mainly "preventive" measures, like firewalls or an exit procedure. A firewall can be defined as a way of filtering network data between a host or a network and another network, such as the Internet, and can be implemented as software running on the machine, hooking into the network stack (or, in the case of most UNIX-based operating systems such as Linux, built into the operating system kernel) to provide real time filtering and blocking. Another implementation is a so-called physical firewall which consists of a separate machine filtering network traffic. Firewalls are common amongst machines that are permanently connected to the Internet.
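
The packet-filtering behaviour described above can be sketched as an ordered rule list with a default-deny policy; the rules and port numbers below are illustrative:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Rule:
    action: str                    # "allow" or "deny"
    dst_port: Optional[int] = None # None matches any destination port

def filter_packet(rules, dst_port: int) -> str:
    """First matching rule wins; deny by default if nothing matches."""
    for rule in rules:
        if rule.dst_port is None or rule.dst_port == dst_port:
            return rule.action
    return "deny"

# Allow HTTPS and SSH in; drop everything else.
rules = [Rule("allow", 443), Rule("allow", 22), Rule("deny")]
https_verdict = filter_packet(rules, 443)  # "allow"
telnet_verdict = filter_packet(rules, 23)  # "deny"
```

Real firewalls match on far more fields (source address, protocol, connection state), but the ordered-rules-plus-default model is the same one used by iptables, pf, and most hardware appliances.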

However, relatively few organisations maintain computer systems with effective detection systems, and fewer still have organised response mechanisms in place. As a result, as Reuters points out: “Companies for the first time report they are losing more through electronic theft of data than physical stealing of assets”. The primary obstacle to effective eradication of cyber crime could be traced to excessive reliance on firewalls and other automated "detection" systems. Yet it is basic evidence gathering, using packet capture appliances, that puts criminals behind bars.

Difficulty with response

Responding forcefully to attempted security breaches (in the manner that one would for attempted physical security breaches) is often very difficult for a variety of reasons:

Identifying attackers is difficult, as they are often in a different jurisdiction to the systems they attempt to breach, and operate through proxies, temporary anonymous dial-up accounts, wireless connections, and other anonymising procedures which make backtracing difficult and are often located in yet another jurisdiction. If they successfully breach security, they are often able to delete logs to cover their tracks.
The sheer number of attempted attacks is so large that organisations cannot spend time pursuing each attacker (a typical home user with a permanent (e.g., cable modem) connection will be attacked at least several times per day, so more attractive targets can be presumed to see many more). Note, however, that the bulk of these attacks are made by automated vulnerability scanners and computer worms.
Law enforcement officers are often unfamiliar with information technology, and so lack the skills and interest to pursue attackers. There are also budgetary constraints: it has been argued that the high cost of technology, such as DNA testing and improved forensics, means less money for other kinds of law enforcement, so the overall rate of criminals not being dealt with goes up as the cost of the technology increases. In addition, identifying attackers across a network may require logs from various points in the network, and in many countries the release of these records to law enforcement (unless voluntarily surrendered by a network administrator or a system administrator) requires a search warrant; depending on the circumstances, the legal proceedings required can be drawn out to the point where the records are either routinely destroyed or the information is no longer relevant.

Reducing vulnerabilities

Computer code is regarded by some as a form of mathematics. It is theoretically possible to prove the correctness of certain classes of computer programs, though the feasibility of actually achieving this in large-scale practical systems is regarded as small by some with practical experience in the industry; see Bruce Schneier et al.

It is also possible to protect messages in transit (i.e., communications) by means of cryptography. One method of encryption—the one-time pad—is unbreakable when correctly used. This method was used by the Soviet Union during the Cold War, though flaws in their implementation allowed some cryptanalysis; see the Venona project. The method uses a matching pair of key-codes, securely distributed, which are used once-and-only-once to encode and decode a single message. For computer transmissions, this method is difficult to use properly (securely), and highly inconvenient as well. Other methods of encryption, while breakable in theory, are often virtually impossible to directly break by any means publicly known today. Breaking them requires some non-cryptographic input, such as a stolen key, stolen plaintext (at either end of the transmission), or some other extra cryptanalytic information.
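
The one-time pad itself is simple to state: XOR the message with truly random key material of the same length, and never reuse the pad. A minimal sketch:

```python
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    # XOR is its own inverse: applying the same pad twice recovers the input.
    return bytes(x ^ y for x, y in zip(a, b))

message = b"ATTACK AT DAWN"
pad = os.urandom(len(message))          # key material as long as the message
ciphertext = xor_bytes(message, pad)    # encrypt
recovered = xor_bytes(ciphertext, pad)  # decrypt with the same pad, once only
```

The difficulty noted above is entirely practical: the pad must be as long as all traffic ever sent, generated from true randomness, distributed securely, and never reused. Reusing a pad (as the Soviets did) lets an analyst XOR two ciphertexts together and cancel the key out.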

Social engineering and direct computer access (physical) attacks can only be prevented by non-computer means, which can be difficult to enforce, relative to the sensitivity of the information. Even in a highly disciplined environment, such as in military organizations, social engineering attacks can still be difficult to foresee and prevent.

Trusting computer program code to behave securely has been pursued for decades. It has proven difficult to determine what code 'will never do.' Mathematical proofs are elusive, in part because it is so difficult to define secure behavior even notionally, let alone mathematically. In practice, only a small fraction of computer program code is mathematically proven, or even goes through comprehensive information technology audits or inexpensive but extremely valuable computer security audits, so it is usually possible for a determined hacker to read, copy, alter, or destroy data in well-secured computers, albeit at the cost of great time and resources. Few attackers would audit applications for vulnerabilities just to attack a single specific system. An attacker's chances can be reduced by keeping systems up to date, using a security scanner, and/or hiring competent people responsible for security. The effects of data loss or damage can be reduced by careful backups and insurance. However, software-based strategies have not yet been discovered for protecting computers from adequately funded, dedicated malicious attacks.

Security by design

Security by design, or alternately secure by design, means that the software has been designed from the ground up to be secure. In this case, security is considered as a main feature.

Some of the techniques in this approach include:


The principle of least privilege, where each part of the system has only the privileges that are needed for its function. That way even if an attacker gains access to that part, they have only limited access to the whole system.
Automated theorem proving to prove the correctness of crucial software subsystems.
Code reviews and unit testing, approaches to make modules more secure where formal correctness proofs are not possible.
Defense in depth, where the design is such that more than one subsystem needs to be violated to compromise the integrity of the system and the information it holds.
Default secure settings, and design to "fail secure" rather than "fail insecure" (see fail-safe for the equivalent in safety engineering). Ideally, a secure system should require a deliberate, conscious, knowledgeable and free decision on the part of legitimate authorities in order to make it insecure.
Audit trails tracking system activity, so that when a security breach occurs, the mechanism and extent of the breach can be determined. Storing audit trails remotely, where they can only be appended to, can keep intruders from covering their tracks.
Full disclosure of all vulnerabilities, to ensure that the "window of vulnerability" is kept as short as possible when bugs are discovered.
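
The principle of least privilege from the list above can be sketched as an explicit permission table; the component and permission names are illustrative:

```python
# Each component is granted only the permissions its function requires.
PRIVILEGES = {
    "web_frontend": {"read_public"},
    "report_generator": {"read_public", "read_reports"},
    "admin_console": {"read_public", "read_reports", "write_config"},
}

def check(component: str, permission: str) -> bool:
    """Unknown components get no privileges at all (fail secure)."""
    return permission in PRIVILEGES.get(component, set())

frontend_ok = check("web_frontend", "read_public")      # True
frontend_escalation = check("web_frontend", "write_config")  # False:
# compromising the frontend does not expose configuration writes.
```

The empty-set default also illustrates the "default secure" item above: anything not explicitly granted is refused.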

Security architecture

The Open Security Architecture organization defines IT security architecture as "the design artifacts that describe how the security controls (security countermeasures) are positioned, and how they relate to the overall information technology architecture. These controls serve the purpose to maintain the system's quality attributes: confidentiality, integrity, availability, accountability and assurance services".

Techopedia defines security architecture as "a unified security design that addresses the necessities and potential risks involved in a certain scenario or environment. It also specifies when and where to apply security controls. The design process is generally reproducible." The key attributes of security architecture are:

the relationship of different components and how they depend on each other.
the determination of controls based on risk assessment, good practice, finances, and legal matters.
the standardization of controls.

Hardware protection mechanisms

While hardware may be a source of insecurity, such as with microchip vulnerabilities maliciously introduced during the manufacturing process, hardware-based or hardware-assisted computer security also offers an alternative to software-only computer security. Using devices and methods such as dongles, trusted platform modules, intrusion-aware cases, drive locks, disabled USB ports, and mobile-enabled access may be considered more secure due to the physical access (or sophisticated backdoor access) required in order to be compromised. Each of these is covered in more detail below.

USB dongles are typically used in software licensing schemes to unlock software capabilities, but they can also be seen as a way to prevent unauthorized access to a computer or other device's software. The dongle, or key, essentially creates a secure encrypted tunnel between the software application and the key. The principle is that an encryption scheme on the dongle, such as the Advanced Encryption Standard (AES), provides a stronger measure of security, since it is harder to hack and replicate the dongle than to simply copy the native software to another machine and use it. Another security application for dongles is to use them for accessing web-based content such as cloud software or Virtual Private Networks (VPNs). In addition, a USB dongle can be configured to lock or unlock a computer.
Trusted platform modules (TPMs) secure devices by integrating cryptographic capabilities onto access devices, through the use of microprocessors, or so-called computers-on-a-chip. TPMs used in conjunction with server-side software offer a way to detect and authenticate hardware devices, preventing unauthorized network and data access.
Computer case intrusion detection refers to a push-button switch that is triggered when a computer case is opened. The firmware or BIOS is programmed to show an alert to the operator the next time the computer boots.
Drive locks are essentially software tools to encrypt hard drives, making them inaccessible to thieves. Tools exist specifically for encrypting external drives as well.
Disabling USB ports is a security option for preventing unauthorized and malicious access to an otherwise secure computer. Infected USB dongles connected to a network from a computer inside the firewall are considered by Network World as the most common hardware threat facing computer networks.
Mobile-enabled access devices are growing in popularity due to the ubiquitous nature of cell phones. Built-in capabilities such as Bluetooth, the newer Bluetooth low energy (LE), Near field communication (NFC) on non-iOS devices and biometric validation such as thumb print readers, as well as QR code reader software designed for mobile devices, offer new, secure ways for mobile phones to connect to access control systems. These control systems provide computer security and can also be used for controlling access to secure buildings.
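
The dongle scheme described above amounts to a challenge-response protocol: the host sends a fresh random challenge, and only hardware holding the secret key can compute the correct answer, so copying the application software alone is useless. A toy sketch, using HMAC from the Python standard library as a stand-in for the dongle's AES hardware (all names here are illustrative):

```python
import hashlib
import hmac
import os
import secrets

SHARED_KEY = secrets.token_bytes(32)  # "burned into" the dongle at manufacture

def dongle_respond(key, challenge):
    """What the dongle computes internally; the key never leaves the device."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def host_verify(dongle, key):
    """Host side: issue a fresh random challenge and check the answer."""
    challenge = os.urandom(16)
    response = dongle(challenge)
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(response, expected)

real_dongle = lambda c: dongle_respond(SHARED_KEY, c)
fake_dongle = lambda c: b"\x00" * 32   # pirated copy without the key

assert host_verify(real_dongle, SHARED_KEY)
assert not host_verify(fake_dongle, SHARED_KEY)
```

Because each challenge is random, replaying an old response does not work; the security rests entirely on the key staying inside the hardware.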

Secure operating systems:

One use of the term "computer security" refers to technology that is used to implement secure operating systems. Much of this technology is based on science developed in the 1980s and used to produce what may be some of the most impenetrable operating systems ever. Though still valid, the technology is in limited use today, primarily because it imposes some changes to system management and also because it is not widely understood. Such ultra-strong secure operating systems are based on operating system kernel technology that can guarantee that certain security policies are absolutely enforced in an operating environment. An example of such a computer security policy is the Bell-LaPadula model. The strategy is based on a coupling of special microprocessor hardware features, often involving the memory management unit, to a special correctly implemented operating system kernel. This forms the foundation for a secure operating system which, if certain critical parts are designed and implemented correctly, can ensure the absolute impossibility of penetration by hostile elements. This capability is enabled because the configuration not only imposes a security policy, but in theory completely protects itself from corruption. Ordinary operating systems, on the other hand, lack the features that assure this maximal level of security. The design methodology to produce such secure systems is precise, deterministic and logical.
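
The Bell-LaPadula policy mentioned above can be stated in two rules: a subject may read an object only at or below its own clearance ("no read up"), and may write only at or above it ("no write down", which prevents leaking classified data to lower levels). A minimal sketch, with illustrative level names:

```python
# Clearance/classification levels, lowest to highest.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top secret": 3}

def can_read(subject_level, object_level):
    # Simple security property: no read up.
    return LEVELS[subject_level] >= LEVELS[object_level]

def can_write(subject_level, object_level):
    # *-property: no write down.
    return LEVELS[subject_level] <= LEVELS[object_level]

assert can_read("secret", "confidential")       # reading down is allowed
assert not can_read("confidential", "secret")   # reading up is forbidden
assert can_write("confidential", "secret")      # writing up is allowed
assert not can_write("secret", "confidential")  # writing down would leak
```

The point of the kernel-enforced designs discussed here is that checks like these are applied by trusted hardware and kernel code on every access, not left to applications.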

Systems designed with such methodology represent the state of the art of computer security although products using such security are not widely known. In sharp contrast to most kinds of software, they meet specifications with verifiable certainty comparable to specifications for size, weight and power. Secure operating systems designed this way are used primarily to protect national security information, military secrets, and the data of international financial institutions. These are very powerful security tools and very few secure operating systems have been certified at the highest level (Orange Book A-1) to operate over the range of "Top Secret" to "unclassified" (including Honeywell SCOMP, USAF SACDIN, NSA Blacker and Boeing MLS LAN). The assurance of security depends not only on the soundness of the design strategy, but also on the assurance of correctness of the implementation, and therefore there are degrees of security strength defined for COMPUSEC. The Common Criteria quantifies security strength of products in terms of two components, security functionality and assurance level (such as EAL levels), and these are specified in a Protection Profile for requirements and a Security Target for product descriptions. None of these ultra-high assurance secure general purpose operating systems have been produced for decades or certified under Common Criteria.

In USA parlance, the term High Assurance usually suggests the system has the right security functions that are implemented robustly enough to protect DoD and DoE classified information. Medium assurance suggests it can protect less valuable information, such as income tax information. Secure operating systems designed to meet medium robustness levels of security functionality and assurance have seen wider use within both government and commercial markets. Medium robust systems may provide the same security functions as high assurance secure operating systems but do so at a lower assurance level (such as Common Criteria levels EAL4 or EAL5). Lower levels mean we can be less certain that the security functions are implemented flawlessly, and therefore that the system is less dependable. These systems are found in use on web servers, guards, database servers, and management hosts and are used not only to protect the data stored on these systems but also to provide a high level of protection for network connections and routing services.

Secure coding:

If the operating environment is not based on a secure operating system capable of maintaining a domain for its own execution, and capable of protecting application code from malicious subversion, and capable of protecting the system from subverted code, then high degrees of security are understandably not possible. While such secure operating systems are possible and have been implemented, most commercial systems fall in a 'low security' category because they rely on features not supported by secure operating systems (like portability, and others). In low security operating environments, applications must be relied on to participate in their own protection. There are 'best effort' secure coding practices that can be followed to make an application more resistant to malicious subversion.

In commercial environments, the majority of software subversion vulnerabilities result from a few known kinds of coding defects. Common software defects include buffer overflows, format string vulnerabilities, integer overflow, and code/command injection. These defects can be used to cause the target system to execute putative data. However, the "data" contain executable instructions, allowing the attacker to gain control of the processor.
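
Code/command injection, one of the defect classes listed above, arises whenever attacker-supplied data is spliced into text that is later interpreted as code. A Python sketch of the defect and the standard remedy (parameterized queries); the table and column names are illustrative:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, secret TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 's3cr3t')")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: the input is concatenated into the SQL text, so the
# attacker's quote characters change the meaning of the query.
rows = db.execute(
    "SELECT secret FROM users WHERE name = '%s'" % attacker_input
).fetchall()
assert rows == [("s3cr3t",)]   # the condition is always true; every row leaks

# Safe: a parameterized query keeps data and code separate.
rows = db.execute(
    "SELECT secret FROM users WHERE name = ?", (attacker_input,)
).fetchall()
assert rows == []              # treated as a literal string; matches nothing
```

The same separation-of-data-from-code discipline is the remedy for shell command injection (argument lists instead of concatenated command strings) and for format string defects (fixed format strings, data passed as arguments).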

Some common languages such as C and C++ are vulnerable to all of these defects (see Seacord, "Secure Coding in C and C++"). Other languages, such as Java, are more resistant to some of these defects, but are still prone to code/command injection and other software defects which facilitate subversion.

Another bad coding practice occurs when an object is deleted during normal operation yet the program neglects to update any of the associated memory pointers, potentially causing system instability when that location is referenced again. This is called a dangling pointer, and the first known exploit for this particular problem was presented in July 2007. Before this publication the problem was known but considered to be academic and not practically exploitable.
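
A C-style dangling pointer cannot be reproduced directly in a memory-safe language, but the underlying bug, keeping a reference past an object's lifetime without updating it, can be illustrated with Python's weakref module. This is a loose analogy only, not the exploit discussed above:

```python
import weakref

class Session:
    def __init__(self, user):
        self.user = user

session = Session("alice")
handle = weakref.ref(session)    # the "pointer" kept elsewhere in the program

assert handle().user == "alice"  # valid while the object is alive

del session                      # object freed, but the stale handle remains
assert handle() is None          # here the dereference yields None; in C the
                                 # same access is undefined behavior, which an
                                 # attacker can sometimes turn into code execution
```

The Python runtime turns the stale reference into a harmless None; in C, the freed memory may have been reallocated to attacker-controlled data, which is what makes the defect exploitable.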

Unfortunately, there is no theoretical model of "secure coding" practices, nor is one practically achievable, insofar as the code (ideally, read-only) and data (generally read/write) generally tend to have some form of defect.

Capabilities and access control lists:

Within computer systems, two of many security models capable of enforcing privilege separation are access control lists (ACLs) and capability-based security. Using ACLs to confine programs has been proven to be insecure in many situations, such as if the host computer can be tricked into indirectly allowing restricted file access, an issue known as the confused deputy problem. It has also been shown that the promise of ACLs of giving access to an object to only one person can never be guaranteed in practice. Both of these problems are resolved by capabilities. This does not mean practical flaws exist in all ACL-based systems, but only that the designers of certain utilities must take responsibility to ensure that they do not introduce flaws.

Capabilities have been mostly restricted to research operating systems, while commercial OSs still use ACLs. Capabilities can, however, also be implemented at the language level, leading to a style of programming that is essentially a refinement of standard object-oriented design. An open source project in the area is the E language.
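
The difference between the two models above can be sketched briefly: an ACL check consults the requester's ambient identity against a per-object list, while a capability is an unforgeable reference that simultaneously names an object and grants the right to use it. An illustrative sketch, not any real OS interface:

```python
# ACL model: access is decided by looking up the requester's identity
# in a per-object list. A confused deputy is a privileged program that
# performs this lookup with *its own* identity on behalf of a caller.
acl = {"payroll.db": {"alice"}}

def acl_read(user, obj):
    if user not in acl.get(obj, set()):
        raise PermissionError(obj)
    return "contents of " + obj

# Capability model: possessing the reference *is* the authority; there
# is no ambient identity for a deputy to be confused about.
class ReadCapability:
    def __init__(self, obj):
        self._obj = obj
    def read(self):
        return "contents of " + self._obj

assert acl_read("alice", "payroll.db") == "contents of payroll.db"

denied = False
try:
    acl_read("bob", "payroll.db")   # bob is not on the list
except PermissionError:
    denied = True
assert denied

cap = ReadCapability("payroll.db")  # handed to alice by a trusted granter
assert cap.read() == "contents of payroll.db"
```

In a real capability system the references are protected by the kernel or the language runtime so they cannot be forged; passing one to another program delegates exactly that authority and nothing more.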

The most secure computers are those not connected to the Internet and shielded from any interference. In the real world, the most secure systems are operating systems where security is not an add-on.

Hacking back:

There has been a significant debate regarding the legality of hacking back against digital attackers (who attempt to or successfully breach an individual's, entity's, or nation's computer). The arguments for such counter-attacks are based on notions of equity, active defense, vigilantism, and the Computer Fraud and Abuse Act (CFAA). The arguments against the practice are primarily based on the legal definitions of "intrusion" and "unauthorized access", as defined by the CFAA. As of October 2012, the debate is ongoing.

Notable computer security attacks and breaches:

Some illustrative examples of different types of computer security breaches are given below.

Robert Morris and the first computer worm:

In 1988, only 60,000 computers were connected to the Internet, and most were mainframes, minicomputers and professional workstations. On November 2, 1988, many started to slow down, because they were running malicious code that demanded processor time and that spread itself to other computers - the first Internet "computer worm". The software was traced back to 23-year-old Cornell University graduate student Robert Tappan Morris, Jr., who said he "wanted to count how many machines were connected to the Internet".

Rome Laboratory:

In 1994, over a hundred intrusions were made by unidentified crackers into the Rome Laboratory, the US Air Force's main command and research facility. Using trojan horses, hackers were able to obtain unrestricted access to Rome's networking systems and remove traces of their activities. The intruders were able to obtain classified files, such as air tasking order systems data and furthermore able to penetrate connected networks of National Aeronautics and Space Administration's Goddard Space Flight Center, Wright-Patterson Air Force Base, some Defense contractors, and other private sector organizations, by posing as a trusted Rome center user.

TJX loses 45.7m customer credit card details:

In early 2007, American apparel and home goods company TJX announced that it was the victim of an unauthorized computer systems intrusion and that the hackers had accessed a system that stored data on credit card, debit card, check, and merchandise return transactions.

Stuxnet attack:

The computer worm known as Stuxnet reportedly ruined almost one-fifth of Iran's nuclear centrifuges by disrupting industrial programmable logic controllers (PLCs) in a targeted attack generally believed to have been launched by Israel and the United States although neither has publicly acknowledged this.

Global surveillance disclosures:

In early 2013, thousands of classified documents were disclosed by NSA contractor Edward Snowden. Called the "most significant leak in U.S. history", it also revealed for the first time the massive breaches of computer security by the NSA, including deliberately inserting a backdoor in a NIST standard for encryption and tapping the links between Google's data centres.

Target And Home Depot Breaches by Rescator:

In 2013 and 2014, a Russian/Ukrainian hacking ring known as "Rescator" broke into Target Corporation computers in 2013, stealing roughly 40 million credit cards, and then into Home Depot computers in 2014, stealing between 53 and 56 million credit card numbers. Warnings were delivered at both corporations, but were ignored; physical security breaches using self-checkout machines are believed to have played a large role. “The malware utilized is absolutely unsophisticated and uninteresting,” says Jim Walter, director of threat intelligence operations at security technology company McAfee - meaning that the heists could have easily been stopped by existing antivirus software had administrators responded to the warnings. The size of the thefts has resulted in major attention from state and federal United States authorities, and the investigation is ongoing.

Legal issues and global regulation:

Conflict of laws in cyberspace has become a major cause of concern for the computer security community. Some of the main challenges and complaints about the antivirus industry are the lack of global web regulations and of a global base of common rules by which to judge, and eventually punish, cyber crimes and cyber criminals. There is no global cyber law or cyber security treaty that can be invoked for enforcing global cyber security issues.

International legal issues of cyber attacks are complicated in nature. For instance, even if an antivirus firm locates the cyber criminal behind the creation of a particular virus or piece of malware, or behind one form of cyber attack, often the local authorities cannot take action due to lack of laws under which to prosecute. This is mainly caused by the fact that many countries have their own regulations regarding cyber crimes. Authorship attribution for cyber crimes and cyber attacks has become a major problem for international law enforcement agencies.

"(Computer viruses) switch from one country to another, from one jurisdiction to another — moving around the world, using the fact that we don't have the capability to globally police operations like this. So the Internet is as if someone had given free plane tickets to all the online criminals of the world." (Mikko Hyppönen) Use of dynamic DNS, fast flux and bullet proof servers have added own complexities to this situation.

Businesses are eager to expand to less developed countries due to the low cost of labor, says White et al. (2012). However, these countries are the ones with the fewest Internet safety measures, and their Internet Service Providers are not as focused on implementing those safety measures (2010). Instead, they put their main focus on expanding their business, which exposes them to an increase in criminal activity.

In response to the growing problem of cyber crime, the European Commission established the European Cybercrime Centre (EC3). The EC3 effectively opened on 1 January 2013 and will be the focal point in the EU's fight against cyber crime, contributing to faster reaction to online crimes. It will support member states and the EU's institutions in building operational and analytical capacity for investigations, as well as cooperation with international partners.

Government:

The role of the government is to make regulations to force companies and organizations to protect their systems, infrastructure and information from any cyber attacks, but also to protect its own national infrastructure, such as the national power grid.

The question of whether the government should intervene in the regulation of cyberspace is a contentious one. Indeed, for as long as it has existed, cyberspace has been, almost by definition, a virtual space free of government intervention. While everyone agrees that improving cybersecurity is vital, is the government the best actor to solve this issue? Many government officials and experts think that the government should step in and that there is a crucial need for regulation, mainly due to the failure of the private sector to solve the cybersecurity problem efficiently. During a panel discussion at the RSA Security Conference in San Francisco, Richard Clarke said he believes that the "industry only responds when you threaten regulation. If industry doesn't respond (to the threat), you have to follow through." On the other hand, executives from the private sector agree that improvements are necessary but think that government intervention would affect their ability to innovate efficiently.

Public–private cooperation:

The Cybersecurity Act of 2010 establishes the creation of an advisory panel, each member of which will be appointed by the President of the United States. They must represent the private sector, the academic sector, the public sector and non-profit organisations. The purpose of the panel is to advise the government as well as help improve strategies.

Actions and teams in the US:

Cybersecurity Act of 2010:

The "Cybersecurity Act of 2010 - S. 773" (full text) was introduced first in the Senate on April 1, 2009 by Senator Jay Rockefeller (D-WV), Senator Evan Bayh (D-IN), Senator Barbara Mikulski (D-MD), Senator Bill Nelson (D-FL), and Senator Olympia Snowe (R-ME). The revised version was approved on March 24, 2010.
The main objective of the bill is to increase collaboration between the public and the private sector on the issue of cybersecurity. It also aims "to ensure the continued free flow of commerce within the United States and with its global trading partners through secure cyber communications, to provide for the continued development and exploitation of the Internet and intranet communications for such purposes, to provide for the development of a cadre of information technology specialists to improve and maintain effective cybersecurity defenses against disruption, and for other purposes."
The act also seeks to instate new, higher standards, processes, technologies and protocols to ensure the security of "critical infrastructure".

International Cybercrime Reporting and Cooperation Act:

On March 25, 2010, Representative Yvette Clarke (D-NY) introduced the "International Cybercrime Reporting and Cooperation Act - H.R.4962" in the House of Representatives; the bill, co-sponsored by seven other representatives (only one of whom was a Republican), was referred to three House committees. The bill seeks to make sure that the administration keeps Congress informed on information infrastructure, cybercrime, and end-user protection worldwide. It also "directs the President to give priority for assistance to improve legal, judicial, and enforcement capabilities with respect to cybercrime to countries with low information and communications technology levels of development or utilization in their critical infrastructure, telecommunications systems, and financial industries" as well as to develop an action plan and an annual compliance assessment for countries of "cyber concern".

Protecting Cyberspace as a National Asset Act of 2010:

On June 19, 2010, United States Senator Joe Lieberman (I-CT) introduced a bill called "Protecting Cyberspace as a National Asset Act of 2010 - S.3480", which he co-wrote with Senator Susan Collins (R-ME) and Senator Thomas Carper (D-DE). If signed into law, this controversial bill, which the American media dubbed the "Kill switch bill", would grant the President emergency powers over the Internet. However, all three co-authors of the bill issued a statement claiming that the bill instead narrowed existing broad Presidential authority to take over telecommunications networks.

White House proposes cybersecurity legislation:

On May 12, 2011, the White House sent Congress a proposed cybersecurity law designed to force companies to do more to fend off cyberattacks, a threat that has been reinforced by recent reports about vulnerabilities in systems used in power and water utilities.

Executive order 13636 Improving Critical Infrastructure Cybersecurity was signed February 12, 2013.

White House Cybersecurity Summit:

President Obama called for a cybersecurity summit, held at Stanford University in February 2015.

Government initiatives:

The government put together several different websites to inform, share and analyze information. Those websites are targeted to different "audiences":

the government itself: states, cities, counties
the public sector
the private sector
the end-user
Here are a few examples:

http://www.msisac.org/ : the Multi-State Information Sharing and Analysis Center. The mission of the MS-ISAC is to improve the overall cyber security posture of state, local, territorial and tribal governments.
http://www.onguardonline.gov/ : The mission of this website is to provide practical tips from the federal government and the technology industry to help the end user be on guard against internet fraud, secure their computers, and protect their private personal information.
http://csrc.nist.gov/ : The Computer Security Division (Computer Security Resource Center) of the National Institute of Standards and Technology. Its mission is to provide assistance, guidelines, specifications, minimum information security requirements...

Military agencies:

Homeland Security:

The Department of Homeland Security has a dedicated division responsible for the response system, risk management program and requirements for cyber security in the United States, called the National Cyber Security Division. The division is home to US-CERT operations and the National Cyber Alert System. The goals of the team are to:

help government and end-users to transition to new cyber security capabilities
R&D
In October 2009, the Department of Homeland Security opened the National Cybersecurity and Communications Integration Center. The center brings together government organizations responsible for protecting computer networks and networked infrastructure.

FBI:

The third priority of the Federal Bureau of Investigation (FBI) is to:

Protect the United States against cyber-based attacks and high-technology crimes
According to the 2010 Internet Crime Report, 303,809 complaints were received via the IC3 website. The Internet Crime Complaint Center, also known as IC3, is a multi-agency task force made up of the FBI, the National White Collar Crime Center (NW3C), and the Bureau of Justice Assistance (BJA).
According to the same report, here are the top 10 reported offenses in the United States only:

1. Non-delivery Payment/Merchandise 14.4%
2. FBI-Related Scams 13.2%
3. Identity Theft 9.8%
4. Computer Crimes 9.1%
5. Miscellaneous Fraud 8.6%
6. Advance Fee Fraud 7.6%
7. Spam 6.9%
8. Auction Fraud 5.9%
9. Credit Card Fraud 5.3%
10. Overpayment Fraud 5.3%

In addition to its own duties, the FBI participates in non-profit organizations such as InfraGard. InfraGard is a private non-profit organization serving as a public-private partnership between U.S. businesses and the FBI. The organization describes itself as an information sharing and analysis effort serving the interests and combining the knowledge base of a wide range of members. InfraGard states they are an association of businesses, academic institutions, state and local law enforcement agencies, and other participants dedicated to sharing information and intelligence to prevent hostile acts against the United States.

Department of Justice:

The Criminal Division of the United States Department of Justice operates a section called the Computer Crime and Intellectual Property Section. The CCIPS is in charge of investigating computer crime and intellectual property crime and is specialized in the search and seizure of digital evidence in computers and networks.
As stated on their website:

"The Computer Crime and Intellectual Property Section (CCIPS) is responsible for implementing the Department's national strategies in combating computer and intellectual property crimes worldwide. The Computer Crime Initiative is a comprehensive program designed to combat electronic penetrations, data thefts, and cyberattacks on critical information systems. CCIPS prevents, investigates, and prosecutes computer crimes by working with other government agencies, the private sector, academic institutions, and foreign counterparts."
USCYBERCOM:
The United States Strategic Command (USSTRATCOM) is one of the nine Unified Combatant Commands of the United States Department of Defense (DoD). The Command, including components, employs more than 2,700 people, representing all four services, including DoD civilians and contractors, who oversee the command's operationally focused global strategic mission. The United States Cyber Command, also known as USCYBERCOM, is a sub-unified command subordinate to USSTRATCOM. Its mission is to "plan, coordinate, integrate, synchronize and conduct activities to: direct the operations and defense of specified Department of Defense information networks and; prepare to, and when directed, conduct full spectrum military cyberspace operations in order to enable actions in all domains, ensure US/Allied freedom of action in cyberspace and deny the same to our adversaries."

FCC:

The U.S. Federal Communications Commission's role in cyber security is to strengthen the protection of critical communications infrastructure, to assist in maintaining the reliability of networks during disasters, to aid in swift recovery after, and to ensure that first responders have access to effective communications services.

Computer Emergency Readiness Team:

Computer Emergency Response Team is a name given to expert groups that handle computer security incidents. In the US, two distinct organizations exist, although they do work closely together.

US-CERT: the United States Computer Emergency Readiness Team is part of the National Cyber Security Division of the United States Department of Homeland Security.
CERT/CC: The Computer Emergency Response Team Coordination Center is a major coordination center created by the Defense Advanced Research Projects Agency (DARPA) and is run by the Software Engineering Institute (SEI).
International actions:
Many different teams and organisations exist, mixing private and public members. Here are some examples:

The Forum of Incident Response and Security Teams (FIRST) is the global association of CSIRTs. The US-CERT, AT&T, Apple, Cisco, McAfee, and Microsoft are all members of this international team.
The Council of Europe helps protect societies worldwide from the threat of cybercrime through the Convention on Cybercrime and its Protocol on Xenophobia and Racism, the Cybercrime Convention Committee (T-CY) and the Project on Cybercrime.
The purpose of the Messaging Anti-Abuse Working Group (MAAWG) is to bring the messaging industry together to work collaboratively and to successfully address the various forms of messaging abuse, such as spam, viruses, denial-of-service attacks and other messaging exploitations. To accomplish this, MAAWG develops initiatives in the three areas necessary to resolve the messaging abuse problem: industry collaboration, technology, and public policy. France Telecom, Facebook, AT&T, Apple, Cisco, and Sprint are some of the members of the MAAWG.
ENISA: The European Network and Information Security Agency (ENISA) is an agency of the European Union. It was created in 2004 by EU Regulation No 460/2004 and has been fully operational since September 1, 2005. It has its seat in Heraklion, Crete (Greece).
The objective of ENISA is to improve network and information security in the European Union. The agency has to contribute to the development of a culture of network and information security for the benefit of the citizens, consumers, enterprises and public sector organisations of the European Union, and consequently will contribute to the smooth functioning of the EU Internal Market.

Germany:

Berlin starts National Cyber Defense Initiative:

On June 16, 2011, the German Minister for Home Affairs officially opened the new German NCAZ (National Center for Cyber Defense), Nationales Cyber-Abwehrzentrum, located in Bonn. The NCAZ closely cooperates with the BSI (Federal Office for Information Security, Bundesamt für Sicherheit in der Informationstechnik), the BKA (Federal Police Organisation, Bundeskriminalamt), the BND (Federal Intelligence Service, Bundesnachrichtendienst), the MAD (Military Intelligence Service, Amt für den Militärischen Abschirmdienst) and other national organisations in Germany responsible for national security aspects. According to the Minister, the primary task of the new organisation, founded on February 23, 2011, is to detect and prevent attacks against the national infrastructure; he mentioned incidents like Stuxnet.

South Korea:

Following cyberattacks in the first half of 2013, whereby government, news-media, television station, and bank websites were compromised, the national government committed to the training of 5,000 new cybersecurity experts by 2017. The South Korean government blamed its northern counterpart for these attacks, as well as for incidents that occurred in 2009, 2011, and 2012, but Pyongyang denies the accusations.

Seoul, March 7, 2011 - South Korean police have contacted 35 countries to ask for cooperation in tracing the origin of a massive cyber attack on the Web sites of key government and financial institutions, amid a nationwide cyber security alert issued against further threats. The Web sites of about 30 key South Korean government agencies and financial institutions came under a so-called distributed denial-of-service (DDoS) attack for two days from Friday, with about 50,000 "zombie" computers infected with a virus seeking simultaneous access to selected sites and swamping them with traffic. As soon as the copies of overseas servers are obtained, the cyber investigation unit will analyse the data to track down the origin of the attacks made from countries, including the United States, Russia, Italy and Israel, the NPA noted.

In late September 2013, a computer-security competition jointly sponsored by the defense ministry and the National Intelligence Service was announced. The winners were to be announced on September 29, 2013, and to share a total prize pool of 80 million won (US$74,000).

India:

India has no specific law dealing with cyber security related issues. Some provisions for cyber security have been incorporated into rules framed under the Information Technology Act 2000, but they are grossly insufficient. Further, the National Cyber Security Policy 2013 has remained ineffective and non-implementable. The cyber security trends and developments in India 2013 have listed the shortcomings of Indian cyber security policy in general and Indian cyber security initiatives in particular. Indian cyber security policy has also failed to protect the civil liberties of Indians, including privacy rights. Civil liberties protection in cyberspace has been blatantly ignored by the Indian government, and e-surveillance projects have been kept intact by the Narendra Modi government. As a result, Indian cyber security efforts are inadequate. There is also no legal obligation for cyber security breach disclosures in India.

However, the Indian Companies Act 2013 has introduced cyber law and cyber security obligations on the part of Indian directors. Cyber security obligations for e-commerce business in India have also been recognised recently.

Canada:

On October 3, 2010, Public Safety Canada unveiled Canada’s Cyber Security Strategy, following a Speech from the Throne commitment to boost the security of Canadian cyberspace. The aim of the strategy is to strengthen Canada’s “cyber systems and critical infrastructure sectors, support economic growth and protect Canadians as they connect to each other and to the world.” Three main pillars define the strategy: securing government systems, partnering to secure vital cyber systems outside the federal government, and helping Canadians to be secure online. The strategy involves multiple departments and agencies across the Government of Canada. The Cyber Incident Management Framework for Canada outlines these responsibilities, and provides a plan for coordinated response between government and other partners in the event of a cyber incident. The Action Plan 2010-2015 for Canada's Cyber Security Strategy outlines the ongoing implementation of the strategy.

Public Safety Canada’s Canadian Cyber Incident Response Centre (CCIRC) is responsible for mitigating and responding to threats to Canada’s critical infrastructure and cyber systems. The CCIRC provides support to mitigate cyber threats, technical support to respond and recover from targeted cyber attacks, and online tools for members of Canada’s critical infrastructure sectors. The CCIRC posts regular cyber security bulletins on the Public Safety Canada website. The CCIRC also operates an online reporting tool where individuals and organizations can report a cyber incident. Canada's Cyber Security Strategy is part of a larger, integrated approach to critical infrastructure protection, and functions as a counterpart document to the National Strategy and Action Plan for Critical Infrastructure.

On September 27, 2010, Public Safety Canada partnered with STOP.THINK.CONNECT, a coalition of non-profit, private sector, and government organizations dedicated to informing the general public on how to protect themselves online. On February 4, 2014, the Government of Canada launched the Cyber Security Cooperation Program. The program is a $1.5 million five-year initiative aimed at improving Canada’s cyber systems through grants and contributions to projects in support of this objective. Public Safety Canada aims to begin an evaluation of Canada's Cyber Security Strategy in early 2015. Public Safety Canada administers and routinely updates the GetCyberSafe portal for Canadian citizens, and carries out Cyber Security Awareness Month during October.

National teams:

Here are the main computer emergency response teams around the world. Most countries maintain their own team to protect network security. On February 27, 2014, China established a network security and informatization leadership group. The group is tasked with focusing on national security and long-term development, coordinating major network security and information technology issues across the economic, political, cultural, social and military fields, developing national strategy, planning and major policy in this area, promoting legislation on network security and information technology, and continually enhancing national security capabilities.

Europe:

CSIRTs in Europe collaborate in the TERENA task force TF-CSIRT. TERENA's Trusted Introducer service provides an accreditation and certification scheme for CSIRTs in Europe. A full list of known CSIRTs in Europe is available from the Trusted Introducer website.

Other countries:

CERT Brazil, member of FIRST (Forum for Incident Response and Security Teams)
CARNet CERT, Croatia, member of FIRST
AE CERT, United Arab Emirates
SingCERT, Singapore
CERT-LEXSI, France, Canada, Singapore
Cybersecurity and modern warfare:
Main article: Cyberwarfare
Cybersecurity is becoming increasingly important as more information and technology are made available in cyberspace. There is growing concern among governments that cyberspace will become the next theatre of warfare. As Mark Clayton of the Christian Science Monitor wrote in an article titled "The New Cyber Arms Race":

In the future, wars will not just be fought by soldiers with guns or with planes that drop bombs. They will also be fought with the click of a mouse half a world away that unleashes carefully weaponized computer programs that disrupt or destroy critical industries like utilities, transportation, communications, and energy. Such attacks could also disable military networks that control the movement of troops, the path of jet fighters, the command and control of warships.

This has led to new terms such as "cyberwarfare" and "cyberterrorism." More and more critical infrastructure is being controlled via computer programs that, while increasing efficiency, expose new vulnerabilities. The test will be whether governments and corporations that control critical systems such as energy and communications will be able to prevent attacks before they occur. As Jay Cross, the chief scientist of the Internet Time Group, remarked, "Connectedness begets vulnerability."

The cyber security job market:

Cyber security is a fast-growing field of IT concerned with reducing organizations' risk of hacking or data breaches. Commercial, government and non-governmental organizations all employ cybersecurity professionals, but the term "cybersecurity" is more prevalent in government job descriptions than in non-government ones, in part due to government "cybersecurity" initiatives (as opposed to corporations' "IT security" initiatives) and the establishment of government institutions like the US Cyber Command and the UK Defence Cyber Operations Group.

Typical cybersecurity job titles and descriptions include:

Security Analyst

Analyzes and assesses vulnerabilities in the infrastructure (software, hardware, networks), investigates available tools and countermeasures to remedy the detected vulnerabilities, and recommends solutions and best practices. Analyzes and assesses damage to the data/infrastructure as a result of security incidents, examines available recovery tools and processes, and recommends solutions. Tests for compliance with security policies and procedures. May assist in the creation, implementation, and/or management of security solutions.
Security Engineer
 Performs security monitoring, security and data/logs analysis, and forensic analysis, to detect security incidents, and mounts incident response. Investigates and utilizes new technologies and processes to enhance security capabilities and implement improvements. May also review code or perform other security engineering methodologies.
Security Architect
Designs a security system or major components of a security system, and may head a security design team building a new security system.
Security Administrator
Installs and manages organization-wide security systems. May also take on some of the tasks of a security analyst in smaller organizations.
Chief Information Security Officer
A high-level management position responsible for the entire information security division/staff. The position may include hands-on technical work.
Security Consultant/Specialist/Intelligence
Broad titles that encompass any one or all of the other roles/titles, tasked with protecting computers, networks, software, data, and/or information systems against viruses, worms, spyware, other malware, unauthorized access, denial-of-service attacks, and an ever-increasing list of attacks by hackers acting as individuals or as part of organized crime or foreign governments.
Student programs are also available for people interested in beginning a career in cybersecurity. Meanwhile, a flexible and effective option for information security professionals of all experience levels to keep studying is online security training, including webcasts.

Terminology:


The following terms used in engineering secure systems are explained below.

Access authorization restricts access to a computer to a group of users through the use of authentication systems. These systems can protect either the whole computer – such as through an interactive login screen – or individual services, such as an FTP server. There are many methods for identifying and authenticating users, such as passwords, identification cards, and, more recently, smart cards and biometric systems.
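The password-based authentication just described is commonly implemented by storing only a salted hash of each password and comparing hashes at login. A minimal sketch using Python's standard library (the iteration count and function names here are illustrative; production systems should use a vetted authentication framework):

```python
import hashlib
import hmac
import os

# Derive a slow, salted hash of the password (never store the password itself).
def hash_password(password: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

# On registration: generate a fresh random salt, store (salt, hash).
def register(password: str):
    salt = os.urandom(16)
    return salt, hash_password(password, salt)

# On login: recompute the hash and compare in constant time.
def verify(password: str, salt: bytes, stored: bytes) -> bool:
    return hmac.compare_digest(hash_password(password, salt), stored)

salt, stored = register("correct horse")
print(verify("correct horse", salt, stored))  # True
print(verify("wrong guess", salt, stored))    # False
```

The salt ensures that identical passwords produce different stored hashes, and the slow key-derivation function makes brute-force guessing expensive.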
Anti-virus software consists of computer programs that attempt to identify, thwart and eliminate computer viruses and other malicious software (malware).
Applications with known security flaws should not be run. Either leave such an application turned off until it can be patched or otherwise fixed, or delete it and replace it with another application. Publicly known flaws are the main entry point used by worms to automatically break into a system and then spread to other systems connected to it. The security website Secunia provides a search tool for unpatched known flaws in popular products.
Authentication techniques can be used to ensure that communication end-points are who they say they are.
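One common end-point authentication technique is a message authentication code computed over a shared secret: only a party holding the key can produce a valid tag. A small sketch with Python's standard library (the key and messages are illustrative):

```python
import hashlib
import hmac

# A secret key both endpoints share, established out of band (illustrative).
KEY = b"shared-secret-established-out-of-band"

# The sender attaches a tag computed over the message.
def tag(message: bytes) -> bytes:
    return hmac.new(KEY, message, hashlib.sha256).digest()

# The receiver recomputes the tag and compares in constant time.
def authentic(message: bytes, received_tag: bytes) -> bool:
    return hmac.compare_digest(tag(message), received_tag)

msg = b"transfer 100 to alice"
t = tag(msg)
print(authentic(msg, t))                         # True
print(authentic(b"transfer 100 to mallory", t))  # False
```

An attacker who tampers with the message in transit cannot forge a matching tag without the key.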
Automated theorem proving and other verification tools can enable critical algorithms and code used in secure systems to be mathematically proven to meet their specifications.
Backups are a way of securing information; they are another copy of all the important computer files kept in another location. These files are kept on hard disks, CD-Rs, CD-RWs, tapes and, more recently, on the cloud. Suggested locations for backups are a fireproof, waterproof, and heatproof safe, or a separate, offsite location from the one in which the original files are kept. Some individuals and companies also keep their backups in safe deposit boxes inside bank vaults. A fourth option is to use one of the file hosting services that back up files over the Internet for both businesses and individuals, known as the cloud.
Backups are also important for reasons other than security. Natural disasters, such as earthquakes, hurricanes, or tornadoes, may strike the building where the computer is located. The building can be on fire, or an explosion may occur. There needs to be a recent backup at an alternate secure location, in case of such kind of disaster. Further, it is recommended that the alternate location be placed where the same disaster would not affect both locations. Examples of alternate disaster recovery sites being compromised by the same disaster that affected the primary site include having had a primary site in World Trade Center I and the recovery site in 7 World Trade Center, both of which were destroyed in the 9/11 attack, and having one's primary site and recovery site in the same coastal region, which leads to both being vulnerable to hurricane damage (for example, primary site in New Orleans and recovery site in Jefferson Parish, both of which were hit by Hurricane Katrina in 2005). The backup media should be moved between the geographic sites in a secure manner, in order to prevent them from being stolen.
Capability and access control list techniques can be used to ensure privilege separation and mandatory access control.
Chain of trust techniques can be used to attempt to ensure that all software loaded has been certified as authentic by the system's designers.
Confidentiality is the nondisclosure of information except to another authorized person.
Cryptographic techniques can be used to defend data in transit between systems, reducing the probability that data exchanged between systems can be intercepted or modified.
Cyberwarfare is an Internet-based conflict that involves politically motivated attacks on information and information systems. Such attacks can, for example, disable official websites and networks, disrupt or disable essential services, steal or alter classified data, and cripple financial systems.
Data integrity is the accuracy and consistency of stored data, indicated by an absence of any alteration in data between two updates of a data record.
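The "absence of alteration between two updates" described above is commonly checked by comparing cryptographic digests of the record. A minimal sketch (the record contents are illustrative):

```python
import hashlib

# Compute a fixed-length fingerprint of a record; any change to the
# record changes the digest with overwhelming probability.
def digest(record: bytes) -> str:
    return hashlib.sha256(record).hexdigest()

original = b"balance=1000;owner=alice"
baseline = digest(original)  # stored alongside (or apart from) the record

tampered = b"balance=9000;owner=alice"
print(digest(original) == baseline)  # True: record unchanged
print(digest(tampered) == baseline)  # False: alteration detected
```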

Cryptographic techniques involve transforming information, scrambling it so it becomes unreadable during transmission. The intended recipient can unscramble the message; ideally, eavesdroppers cannot.
Encryption is used to protect the message from the eyes of others. Cryptographically secure ciphers are designed to make any practical attempt of breaking infeasible. Symmetric-key ciphers are suitable for bulk encryption using shared keys, and public-key encryption using digital certificates can provide a practical solution for the problem of securely communicating when no key is shared in advance.
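The shared-key idea above can be illustrated with a toy one-time pad: both parties hold the same random key, and the same XOR operation encrypts and decrypts. This is a teaching sketch only; real systems use vetted ciphers such as AES, not hand-rolled XOR:

```python
import secrets

# XOR each byte of the data with the corresponding key byte.
# Applying the same operation twice with the same key restores the data.
def xor(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

plaintext = b"meet at noon"
key = secrets.token_bytes(len(plaintext))  # the shared secret key

ciphertext = xor(plaintext, key)  # sender encrypts
recovered = xor(ciphertext, key)  # receiver decrypts with the same key
print(recovered == plaintext)     # True
```

Public-key encryption removes the need for this pre-shared key by letting anyone encrypt to a published key that only the holder of the matching private key can decrypt.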
Endpoint security software helps networks to prevent exfiltration (data theft) and virus infection at network entry points made vulnerable by the prevalence of potentially infected portable computing devices, such as laptops and mobile devices, and external storage devices, such as USB drives.
Firewalls are an important method for control and security on the Internet and other networks. A network firewall can be a communications processor, typically a router, or a dedicated server, along with firewall software. A firewall serves as a gatekeeper system that protects a company's intranets and other computer networks from intrusion by providing a filter and safe transfer point for access to and from the Internet and other networks. It screens all network traffic for proper passwords or other security codes and only allows authorized transmission in and out of the network. Firewalls can deter, but not completely prevent, unauthorized access (hacking) into computer networks; they can also provide some protection from online intrusion.
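The screening behaviour described above boils down to matching traffic against an ordered rule list with a default-deny policy. A conceptual sketch (the rule format is illustrative, not any real firewall's syntax):

```python
# Ordered firewall rules: first match wins (illustrative rule set).
RULES = [
    ("allow", "tcp", 22),   # SSH
    ("allow", "tcp", 443),  # HTTPS
    ("deny",  "tcp", 23),   # Telnet
]

def filter_packet(proto: str, port: int) -> str:
    for action, r_proto, r_port in RULES:
        if proto == r_proto and port == r_port:
            return action
    return "deny"  # anything not explicitly allowed is denied

print(filter_packet("tcp", 443))  # allow
print(filter_packet("tcp", 23))   # deny: explicitly blocked
print(filter_packet("udp", 53))   # deny: no matching rule
```

Real firewalls match on many more fields (source/destination address, connection state, interface), but the default-deny principle is the same.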
Honey pots are computers that are either intentionally or unintentionally left vulnerable to attack by crackers. They can be used to catch crackers or fix vulnerabilities.
Intrusion-detection systems can scan a network for people that are on the network but who should not be there or are doing things that they should not be doing, for example trying a lot of passwords to gain access to the network.
A microkernel is the near-minimum amount of software that can provide the mechanisms to implement an operating system. It is used solely to provide very low-level, very precisely defined machine code upon which an operating system can be developed. A simple example is the early '90s GEMSOS (Gemini Computers), which provided extremely low-level machine code, such as "segment" management, atop which an operating system could be built. The theory (in the case of "segments") was that—rather than have the operating system itself worry about mandatory access separation by means of military-style labeling—it is safer if a low-level, independently scrutinized module can be charged solely with the management of individually labeled segments, be they memory "segments" or file system "segments" or executable text "segments." If software below the visibility of the operating system is (as in this case) charged with labeling, there is no theoretically viable means for a clever hacker to subvert the labeling scheme, since the operating system per se does not provide mechanisms for interfering with labeling: the operating system is, essentially, a client (an "application," arguably) atop the microkernel and, as such, subject to its restrictions.
Pinging The ping application can be used by potential crackers to find if an IP address is reachable. If a cracker finds a computer, they can try a port scan to detect and attack services on that computer.
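The port probe that follows such a ping can be sketched with a plain TCP connection attempt: if the connection succeeds, a service is listening. This sketch probes only a listener it creates itself on localhost; scanning hosts you do not own is typically unauthorized:

```python
import socket

# Attempt a TCP connection; connect_ex returns 0 on success.
def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

# Demonstrate against a listener we create ourselves.
server = socket.socket()
server.bind(("127.0.0.1", 0))  # OS picks a free port
server.listen(1)
port = server.getsockname()[1]

print(port_open("127.0.0.1", port))  # True: something is listening
server.close()
```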
Social engineering awareness keeps employees aware of the dangers of social engineering and/or having a policy in place to prevent social engineering can reduce successful breaches of the network and servers.

OPERATING SYSTEM: UBUNTU



Developer Canonical Ltd., Ubuntu community
OS family Linux
Working state Current
Source model Open source
Initial release 20 October 2004; 10 years ago
Latest release 15.04 Vivid Vervet / 23 April 2015; 3 days ago
Marketing target Personal computers, Servers, smartphones, tablet computers (Ubuntu Touch), smart TVs (Ubuntu TV)
Available in More than 55 languages by LoCos
Update method APT (Software Updater, Ubuntu Software Center)
Package manager dpkg, Click packages
Platforms IA-32, x86-64, ARMv7, ARM64, Power
Kernel type Monolithic (Linux)
Userland GNU
Default user interface
Unity (since Ubuntu 11.04)
GNOME (until Ubuntu 10.10)
License Free software licenses
(mainly GPL)
Official website www.ubuntu.com

Ubuntu is a Debian-based Linux operating system, with Unity as its default desktop environment. It is based on free software and named after the Southern African philosophy of ubuntu (literally, "human-ness"), which often is translated as "humanity towards others" or "the belief in a universal bond of sharing that connects all humanity".

Development of Ubuntu is led by UK-based Canonical Ltd., a company owned by South African entrepreneur Mark Shuttleworth. Canonical generates revenue through the sale of technical support and other services related to Ubuntu. The Ubuntu project is publicly committed to the principles of open-source software development; people are encouraged to use free software, study how it works, improve upon it, and distribute it.

Contents

1 Features
2 Security
3 History and development process
4 Installation
5 Package classification and support
5.1 Third-party software
6 Releases
7 Variants
7.1 Chinese derivative UbuntuKylin
7.2 Ubuntu Server
7.3 Ubuntu Touch
7.4 Cloud computing
8 Adoption and reception
8.1 Installed base
8.2 Large-scale deployments
8.3 Critical reception
8.4 Amazon controversy
9 Local communities (LoCos)
10 Hardware vendor support

Features

A default installation of Ubuntu contains a wide range of software that includes LibreOffice, Firefox, Thunderbird, Transmission, and several lightweight games such as Sudoku and chess. Many additional software packages, including titles no longer in the default installation such as Evolution, GIMP, Pidgin, and Synaptic, are accessible from the built-in Ubuntu Software Center as well as any other APT-based package management tool. Execution of Microsoft Office and other Microsoft Windows applications can be facilitated via the Wine compatibility package or through the use of a virtual machine such as VirtualBox or VMware Workstation.

Security

Ubuntu's goal is to be secure "out of the box". By default, the user's programs run with low privileges and cannot corrupt the operating system or other users' files. For increased security, the sudo tool is used to assign temporary privileges for performing administrative tasks, which allows the root account to remain locked and helps prevent inexperienced users from inadvertently making catastrophic system changes or opening security holes. PolicyKit is also being widely implemented into the desktop to further harden the system. Most network ports are closed by default to prevent hacking. A built-in firewall allows end-users who install network servers to control access, and a GUI for Uncomplicated Firewall is available to configure it. Ubuntu compiles its packages using GCC features such as PIE and buffer overflow protection to harden its software. These extra features greatly increase security at a performance cost of about 1% on 32-bit and 0.01% on 64-bit.

The home and Private directories can be encrypted.

History and development process

Ubuntu is built on Debian's architecture and infrastructure, to provide Linux server, desktop, phone, tablet and TV operating systems. Ubuntu releases updated versions predictably – every six months – and each release receives free support for nine months (eighteen months prior to 13.04) with security fixes, high-impact bug fixes and conservative, substantially beneficial low-risk bug fixes. The first release was in October 2004.

It was decided that every fourth release, issued on a two-year basis, would receive long-term support (LTS). Long-term support includes updates for new hardware, security patches and updates to the 'Ubuntu stack' (cloud computing infrastructure). The first LTS releases were supported for three years on the desktop and five years on the server; since Ubuntu 12.04 LTS, desktop support for LTS releases was increased to five years as well. LTS releases get regular point releases with support for new hardware and integration of all the updates published in that series to date.

Ubuntu packages are based on packages from Debian's unstable branch: both distributions use Debian's deb package format and package management tools (APT and Ubuntu Software Center). Debian and Ubuntu packages are not necessarily binary compatible with each other, however; packages may need to be rebuilt from source to be used in Ubuntu. Many Ubuntu developers are also maintainers of key packages within Debian. Ubuntu cooperates with Debian by pushing changes back to Debian, although there has been criticism that this does not happen often enough. Ian Murdock, the founder of Debian, has expressed concern about Ubuntu packages potentially diverging too far from Debian to remain compatible. Before release, packages are imported from Debian Unstable continuously and merged with Ubuntu-specific modifications. A month before release, imports are frozen, and packagers then work to ensure that the frozen features interoperate well together.

Ubuntu is currently funded by Canonical Ltd. On 8 July 2005, Mark Shuttleworth and Canonical Ltd. announced the creation of the Ubuntu Foundation and provided an initial funding of US$10 million. The purpose of the foundation is to ensure the support and development for all future versions of Ubuntu. Mark Shuttleworth describes the foundation as an "emergency fund" (in case Canonical's involvement ends).

On 12 March 2009, Ubuntu announced developer support for third-party cloud management platforms, such as those used at Amazon EC2.

Unity became the default GUI for Ubuntu Desktop in 11.04 (it was not included in 10.10).


Installation


Ubuntu running on the Nexus S, a smartphone that ran Android prior to Ubuntu
The system requirements vary among Ubuntu products. For the Ubuntu desktop release 14.04, a PC with at least 768 MB of RAM and 5 GB of disk space is recommended. For less powerful computers, there are other Ubuntu distributions such as Lubuntu and Xubuntu. As of version 12.04, Ubuntu supports the ARM architecture. Ubuntu is also available on PowerPC and SPARC platforms, although these platforms are not officially supported.

Live images are the typical way for users to assess and subsequently install Ubuntu. These can be downloaded as a disk image (.iso) and subsequently burnt to a DVD and booted, or run via UNetbootin directly from a USB drive (making, respectively, a live DVD or live USB medium). Running Ubuntu in this way is typically slower than running it from a hard drive, but does not alter the computer unless specifically instructed by the user. If the user chooses to boot the live image rather than execute an installer at boot time, there is still the option to then use an installer called Ubiquity to install Ubuntu once booted into the live environment. Disk images of all current and past versions are available for download at the Ubuntu web site. Various third-party programs such as remastersys and Reconstructor are available to create customized copies of the Ubuntu Live DVDs (or CDs). "Minimal CDs" are available (for server use) that fit on a CD.

Additionally, USB flash drive installations can be used to boot Ubuntu and Kubuntu in a way that allows permanent saving of user settings and portability of the USB-installed system between physical machines (however, the computer's BIOS must support booting from USB). In newer versions of Ubuntu, the Ubuntu Live USB creator can be used to install Ubuntu on a USB drive (with or without a live CD or DVD). Creating a bootable USB drive with persistence is as simple as dragging a slider to determine how much space to reserve for persistence; for this, Ubuntu employs casper.

The desktop edition can also be installed using the Netboot image (aka netbook tarball) which uses the debian-installer and allows certain specialist installations of Ubuntu: setting up automated deployments, upgrading from older installations without network access, LVM and/or RAID partitioning, installs on systems with less than about 256 MB of RAM (although low-memory systems may not be able to run a full desktop environment reasonably).

Package classification and support

Ubuntu divides most software into four domains to reflect differences in licensing and the degree of support available. Some unsupported applications receive updates from community members, but not from Canonical Ltd.

             Free software   Non-free software
Supported    Main            Restricted
Unsupported  Universe        Multiverse
Free software includes software that has met the Ubuntu licensing requirements, which roughly correspond to the Debian Free Software Guidelines. Exceptions, however, include firmware and fonts in the Main category, because although they may not be modified, their distribution is otherwise unencumbered.

Non-free software is usually unsupported (Multiverse), but some exceptions (Restricted) are made for important non-free software. Supported non-free software includes device drivers that can be used to run Ubuntu on some current hardware, such as binary-only graphics card drivers. The level of support in the Restricted category is more limited than that of Main, because the developers may not have access to the source code. It is intended that Main and Restricted should contain all software needed for a complete desktop environment. Alternative programs for the same tasks and programs for specialized applications are placed in the Universe and Multiverse categories.

In addition to the above, in which the software does not receive new features after an initial release, Ubuntu Backports is an officially recognized repository for backporting newer software from later versions of Ubuntu. The repository is not comprehensive; it consists primarily of user-requested packages, which are approved if they meet quality guidelines. Backports receives no support at all from Canonical, and is entirely community-maintained.

The -updates repository provides stable release updates (SRUs) for Ubuntu, which are generally installed through update-manager. Each release is given its own -updates repository (e.g. intrepid-updates). The repository is supported by Canonical Ltd. for packages in main and restricted, and by the community for packages in universe and multiverse. All updates to the repository must meet certain requirements and go through the -proposed repository before being made available to the public. Updates are scheduled to be available until the end of life of the release.

In addition to the -updates repository, the unstable -proposed repository contains uploads which must be confirmed before being copied into -updates. All updates must go through this process to ensure that the patch does truly fix the bug and there is no risk of regression. Updates in -proposed are confirmed by either Canonical or members of the community.

Canonical's partner repository lets vendors of proprietary software deliver their products to Ubuntu users at no cost through the same familiar tools for installing and upgrading software. The software in the partner repository is officially supported with security and other important updates by its respective vendors. Canonical supports the packaging of the software for Ubuntu and provides guidance to vendors. The partner repository is disabled by default and can be enabled by the user. Some popular products distributed via the partner repository as of 28 April 2013 are Adobe Flash Player, Adobe Reader and Skype.

Third-party software

Ubuntu has a certification system for third-party software. Some third-party software that does not limit distribution is included in Ubuntu's multiverse component. The package ubuntu-restricted-extras additionally contains software that may be legally restricted, including support for MP3 and DVD playback, Microsoft TrueType core fonts, Sun's Java runtime environment, Adobe's Flash Player plugin, many common audio/video codecs, and unrar, an unarchiver for files compressed in the RAR file format.

Additionally, third-party application suites are available for purchase through the Ubuntu Software Center, including many high-quality games such as Braid and Oil Rush, as well as software for DVD playback and media codecs.

Steam is also available for Ubuntu with a wide range of indie games, such as Amnesia: The Dark Descent, as well as some AAA titles, such as Counter-Strike: Source and Half-Life 2.

Releases

For more details on all Ubuntu releases including older ones not covered here, see List of Ubuntu releases.
Version    Code name         Release date  Supported until
                                           Desktop      Server
10.04 LTS  Lucid Lynx        2010-04-29    2013-05-09   2015-04-30
10.10      Maverick Meerkat  2010-10-10    2012-04-10
11.04      Natty Narwhal     2011-04-28    2012-10-28
11.10      Oneiric Ocelot    2011-10-13    2013-05-09
12.04 LTS  Precise Pangolin  2012-04-26    2017-04
12.10      Quantal Quetzal   2012-10-18    2014-05-16
13.04      Raring Ringtail   2013-04-25    2014-01-27
13.10      Saucy Salamander  2013-10-17    2014-07-17
14.04 LTS  Trusty Tahr       2014-04-17    2019-04
14.10      Utopic Unicorn    2014-10-23    2015-07
15.04      Vivid Vervet      2015-04-23    2016-01

Each Ubuntu release has a version number that consists of the year and month number of the release. For example, the first release was Ubuntu 4.10, as it was released on 20 October 2004. Version numbers for future versions are provisional; if the release is delayed, the version number changes accordingly.
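The YY.MM scheme just described can be sketched as a small helper (the function name is illustrative, not part of any Ubuntu tooling):

```python
from datetime import date

# Derive an Ubuntu-style version string from a release date:
# two-digit year (without century) and zero-padded month.
def ubuntu_version(release_day: date) -> str:
    return f"{release_day.year % 100}.{release_day.month:02d}"

print(ubuntu_version(date(2004, 10, 20)))  # 4.10, the first release
print(ubuntu_version(date(2015, 4, 23)))   # 15.04
```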

Ubuntu releases are also given alliterative code names, using an adjective and an animal (e.g., "Trusty Tahr" and "Precise Pangolin"). With the exception of the first two releases, code names are in alphabetical order, allowing a quick determination of which release is newer. "We might skip a few letters, and we'll have to wrap eventually," said Mark Shuttleworth, describing the naming scheme. Commonly, Ubuntu releases are referred to using only the adjective portion of the code name; for example, the 14.04 LTS release is commonly known as "Trusty".

Releases are timed to be approximately one month after GNOME releases (which in turn are about one month after releases of X.org). As a result, every Ubuntu release was introduced with an updated version of both GNOME and X. After each release, the Ubuntu Developer Summit (UDS) is held, at which the Ubuntu community sets the development direction for the next cycle.

Upgrades between releases have to be done from one release to the next release (e.g. Ubuntu 13.10 to Ubuntu 14.04) or from one LTS release to the next LTS release (e.g. Ubuntu 12.04 LTS to Ubuntu 14.04 LTS).
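The upgrade-path rule above can be expressed as a small check: an upgrade is valid either to the immediately following release or between consecutive LTS releases (the release list below is taken from the table in this article; the function itself is an illustrative sketch, not Ubuntu's actual upgrade tooling):

```python
# Ordered releases and which of them are LTS (from the release table).
RELEASES = ["12.04", "12.10", "13.04", "13.10", "14.04", "14.10", "15.04"]
LTS = {"12.04", "14.04"}

def valid_upgrade(src: str, dst: str) -> bool:
    i, j = RELEASES.index(src), RELEASES.index(dst)
    consecutive = j == i + 1                      # release -> next release
    lts_jump = (src in LTS and dst in LTS and     # LTS -> next LTS only
                not any(r in LTS for r in RELEASES[i + 1 : j]))
    return consecutive or lts_jump

print(valid_upgrade("13.10", "14.04"))  # True: next release
print(valid_upgrade("12.04", "14.04"))  # True: LTS to the next LTS
print(valid_upgrade("12.10", "14.04"))  # False: skips releases
```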

Ubuntu 10.10 (Maverick Meerkat) was released on 10 October 2010 (10-10-10). This departed from the traditional schedule of releasing at the end of October in order to get "the perfect 10", and made a playful reference to The Hitchhiker's Guide to the Galaxy books, since, in binary, 101010 equals decimal 42, the "Answer to the Ultimate Question of Life, the Universe and Everything" within the series.

A DVD or bootable flash drive of 1 GB or more is required for Ubuntu 12.10 and later. Server releases still fit on CDs.

Variants
Ubuntu family tree

The variant officially recommended for most users, and officially supported by Canonical, is Ubuntu Desktop (formerly named Ubuntu Desktop Edition, and simply called Ubuntu), designed for desktop and laptop PCs using the Unity desktop interface (earlier versions used GNOME). A number of other variants are distinguished simply by each featuring a different desktop environment: Ubuntu GNOME (with the GNOME desktop environment), Ubuntu MATE (with the MATE desktop environment), Kubuntu (with KDE Plasma Workspaces), Lubuntu (with LXDE), and Xubuntu (with Xfce). LXDE and Xfce are sometimes recommended for use with older PCs that may have less memory and processing power available.

These variants are not commercially supported by Canonical.

Besides Ubuntu Desktop, there are several other official Ubuntu editions, which are created and maintained by Canonical and the Ubuntu community and receive full support from Canonical, its partners and the Community. They include the following:

Ubuntu Business Desktop Remix, a release meant for business users that comes with special enterprise software including Adobe Flash, Canonical Landscape, OpenJDK 6 and VMware View, while removing social networking and file sharing applications, games and development/sysadmin tools. The goal of the Business Desktop Remix is not to copy other enterprise-oriented distributions, such as Red Hat Enterprise Linux, but to make it, according to Mark Shuttleworth's blog, "easier for institutional users to evaluate Ubuntu Desktop for their specific needs."

Since Precise Pangolin (12.04), Kubuntu has been a community-supported variant of the Ubuntu distribution that uses KDE Plasma Workspaces.
Ubuntu TV, labeled "TV for human beings" by Canonical, was introduced at the 2012 Consumer Electronics Show by Canonical's marketing executive John D. Bernard. Created for smart TVs, Ubuntu TV aimed to provide access to popular Internet services and stream content to mobile devices running Android, iOS and Ubuntu. Launchpad.net has not shown any development activity since late 2011.
There are many Ubuntu variants (or derivatives) based on the official Ubuntu editions. These Ubuntu variants install a default set of packages that differ from the official Ubuntu distributions.

The variants recognized by Canonical as contributing significantly towards the Ubuntu project are the following:

Edubuntu, a subproject and add-on for Ubuntu, designed for school environments and home users.
Mythbuntu, designed for creating a home theater PC with MythTV and uses the Xfce desktop environment.
Ubuntu Studio, a distribution made for professional video and audio editing, comes with higher-end free editing software.
Edubuntu, Mythbuntu and Ubuntu Studio are not commercially supported by Canonical.

Other variants are created and maintained by individuals and organizations outside of Canonical, and they are self-governed projects that work more or less closely with the Ubuntu community.

Chinese derivative UbuntuKylin

As of Ubuntu 10.10, a Chinese version of Ubuntu Desktop called "Ubuntu Chinese Edition" had been released alongside the various other editions. However, in 2013, Canonical reached an agreement with the Ministry of Industry and Information Technology of the People's Republic of China to make Ubuntu the new basis of the Kylin operating system starting with Raring Ringtail (version 13.04). The first version of Ubuntu Kylin was released on 25 April 2013.

Ubuntu Server

Ubuntu has a server edition that uses the same APT repositories as the Ubuntu Desktop Edition. The differences between them are the absence of an X Window environment in a default installation of the server edition (although one can easily be installed, including Unity, GNOME, KDE or Xfce) and the installation process. The server edition uses a character-based screen interface for installation instead of a graphical installation process.

Ubuntu 10.04 Server Edition can also run on VMware ESX Server, Oracle's VirtualBox and Oracle VM, Citrix Systems XenServer hypervisors, Microsoft Hyper-V, QEMU, Kernel-based Virtual Machine, or any other IBM PC compatible emulator or virtualizer. Ubuntu 10.04 turns on AppArmor (a security module for the Linux kernel) by default on key software packages, and the firewall is extended to common services used by the operating system.

Ubuntu 12.04 LTS Server Edition supports three major architectures: IA-32, x86-64 and ARM. The minimum memory requirement is 128 MB of RAM.

Ubuntu 14.04 LTS Server Edition includes MySQL 5.5, OpenJDK 7, Samba 4.1, PHP 5.5 and Python 2.7.

Ubuntu Touch

For more details on this topic, see Ubuntu Touch.
Ubuntu Touch is an alternate version of Ubuntu developed for smartphones and tablets, announced on 2 January 2013 and released to manufacturing on 16 September 2014. The first device to run it was the Galaxy Nexus. A demo for higher-end Ubuntu smartphones showed a full Ubuntu desktop running when the phone was connected to a monitor and keyboard, a capability that was to ship as Ubuntu for Android. A concept for a smartphone running Ubuntu for Phones was published on Ubuntu's official channel on YouTube. The platform allows developing one app with two interfaces: a smartphone UI and, when docked, a desktop UI. Ubuntu for Tablets was previewed on 19 February 2013. According to the keynote video, an Ubuntu phone will be able to connect to a tablet, which will then use a tablet interface; plugging a keyboard and mouse into the tablet will transform the setup into a desktop; and plugging a television monitor into the phone will bring up the Ubuntu TV interface.

On 6 February 2015, the first smartphone running Ubuntu "out of the box" was announced. The BQ Aquaris E4.5 Ubuntu Edition features a 4.5-inch qHD display, a 1.3 GHz quad-core Cortex-A7 processor and 1 GB of RAM, priced at €169.90.

Cloud computing


Eucalyptus interface

Ubuntu offers Ubuntu Cloud Images, pre-installed disk images customized by Ubuntu engineering to run on cloud platforms such as Amazon EC2, OpenStack, Windows Azure and LXC. Ubuntu is also prevalent on VPS platforms such as DigitalOcean.

Ubuntu 11.04 added support for OpenStack, with Eucalyptus-to-OpenStack migration tools added by Canonical in Ubuntu Server 11.10. Ubuntu 11.10 increased the focus on OpenStack as Ubuntu's preferred IaaS offering, though Eucalyptus is also supported. Another major focus is Canonical's Juju for provisioning, deploying, hosting, managing and orchestrating enterprise data center infrastructure services, by, with and for Ubuntu Server.

Adoption and reception

Installed base

Because Ubuntu does not require registration, its usage numbers can only be estimated. In fall 2011, Canonical estimated that Ubuntu had more than 20 million users worldwide.

W3Techs estimated in October 2013 that:

Ubuntu is used by 26.1% of all Linux websites, behind only Debian (on which Ubuntu is based), which is used by 32.7% of all Linux websites.
Ubuntu is the most popular Linux distribution among the top 1,000 sites and gains around 500 sites among the top 10 million websites per day.
Ubuntu is used by 8.2% of all websites analyzed, growing from less than 7% in October 2012.
W3Techs analyses only the top 10 million websites. It considers Linux a subcategory of Unix and estimated in the same month that 66.4% of the analyzed websites use Unix.

According to thecloudmarket.com, Ubuntu is on at least 54% of the images it scanned on Amazon EC2.

Wikimedia data (based on user agent strings) for September 2013 shows that, among recognizable Linux distributions, Ubuntu generated the most page requests to Wikimedia sites.


Large-scale deployments

The public sector has also adopted Ubuntu. As of January 2009, the Ministry of Education and Science of the Republic of Macedonia had deployed more than 180,000 Ubuntu-based classroom desktops and encouraged every student in the country to use Ubuntu-powered computer workstations; the Spanish school system has 195,000 Ubuntu desktops. The French police, having already started using open-source software in 2005 by replacing Microsoft Office with OpenOffice.org, decided to transition to Ubuntu from Windows XP after the release of Windows Vista in 2006. By March 2009, the Gendarmerie Nationale had already switched 5,000 workstations to Ubuntu. Based on the success of that transition, it planned to switch 15,000 more by the end of 2009 and all 90,000 of its workstations by 2015 (the GendBuntu project). Lt. Colonel Guimard announced that the move was very easy and allowed a 70% saving on the IT budget without reducing capabilities. In 2011, Ubuntu 10.04 was adopted by the Indian justice system. The Government of Kerala adopted Ubuntu for the state's legislators, and the government schools of Kerala, which had previously used Windows, began using a customized IT@School Project Ubuntu 10.04 containing software specially created for students. Textbooks were also revised to cover Ubuntu and are currently used in schools.

The city of Munich, Germany, forked Kubuntu 10.04 LTS to create LiMux for use on the city's computers. After originally planning to migrate 12,000 desktop computers to LiMux, the city announced in December 2013 that the project had completed successfully, with 14,800 of 15,500 desktop computers migrated. In March 2012, the government of Iceland launched a project to get all public institutions using free and open-source software; several government agencies and schools have already adopted Ubuntu. The government cited cost savings as a major factor in the decision and also stated that open-source software avoids vendor lock-in. A 12-month project was launched to migrate the biggest public institutions in Iceland to open source and to help ease the migration for others. Incumbent U.S. president Barack Obama's successful campaign for re-election in 2012 used Ubuntu in its IT department. In August 2014, the city of Turin, Italy, announced a migration from Windows XP to Ubuntu for the 8,300 desktop computers used by the municipality, becoming the first city in Italy to adopt Ubuntu.

Critical reception

Ubuntu was awarded the Reader Award for best Linux distribution at the 2005 LinuxWorld Conference and Expo in London, received favorable reviews in online and print publications, and won InfoWorld's 2007 Bossie Award for Best Open Source Client OS. In early 2008, PC World named Ubuntu the "best all-around Linux distribution available today", though it criticized the lack of an integrated desktop effects manager. Chris DiBona, the program manager for open-source software at Google, said, "I think Ubuntu has captured people's imaginations around the Linux desktop," and "If there is a hope for the Linux desktop, it would be them." As of January 2009, almost half of Google's 20,000 employees used a slightly modified version of Ubuntu.

In 2008, Jamie Hyneman, co-host of the American television series Mythbusters, advocated Linux (giving the example of Ubuntu) as a solution to software bloat. Other celebrity users of Ubuntu include science fiction writer Cory Doctorow and actor Stephen Fry.

In March 2013, Canonical announced that it had decided to develop Mir, reversing an earlier plan to move to Wayland as the primary Ubuntu display server and causing widespread objection from the open-source desktop community. X.Org contributor Daniel Stone opined: "I'm just irritated that this means more work for us, more work for upstream developers, more work for toolkits, more work for hardware vendors...." In September 2013, an Intel developer removed XMir support from their video driver and wrote, "We do not condone or support Canonical in the course of action they have chosen, and will not carry XMir patches upstream".

In January 2014, the UK's authority for computer security, CESG, reported that Ubuntu 12.04 LTS was "the only operating system that passes as many as 9 out of 12 requirements without any significant risks".

As of 2015, Ubuntu's page on DistroWatch is the second most accessed among Linux distribution pages there, behind Linux Mint.

Ubuntu's developers acknowledged battery life problems from version 10.04 and sought to reduce power consumption in the 12.04 LTS release. The 14.04 release improved the situation but still lagged behind other operating systems on battery life.

Amazon controversy

One of the new features of Unity in Ubuntu 12.10 was the shopping lens, which displayed Amazon search results in the Unity dash. It was variously described as the "Amazon controversy", a "privacy fiasco" and "spyware".

As of October 2012, it sent (through a secure HTTPS connection) the user's queries from the home lens to productsearch.ubuntu.com, which then polled Amazon.com to find relevant products; Amazon then sent product images directly to the user's computer through HTTP (this changed in September 2013). If the user clicked one of these results and then bought something, Canonical received a small fraction of the sale.

In 2012, many reviewers criticized the feature: since the home lens is the natural means of searching for content on the local machine, reviewers were concerned about the disclosure of queries that were intended to stay local, creating a privacy problem. Because the feature was active by default rather than opt-in, many users could be unaware of it.

Some users chose to turn it off or to remove the feature using a patch. An April 2014 article by Scott Gilbertson stated that in version 14.04 the online search components of Ubuntu could be turned off with a couple of clicks. The feature may be changed to opt-in in a future release.

For this feature, Canonical was awarded the 2013 Austrian Big Brother Award.

Local communities (LoCos)

Not to be confused with Linux User Group.
In an effort to reach out to less technical users and to foster a sense of community around the distribution, Local Communities, better known as "LoCos", have been established throughout the world. Originally, each country had one LoCo team; however, in some areas, most notably the United States and Canada, each state or province may establish its own team. A LoCo Council approves teams based on their efforts to aid in either the development or the promotion of Ubuntu.


Hardware vendor support

Ubuntu works closely with OEMs to jointly make Ubuntu available on a wide range of devices. A number of vendors offer computers with Ubuntu pre-installed, including Dell, Hasee, Sharp Corporation, System76, WeWi and Tesco. Dell, specifically, offers the XPS 13 Developer Edition laptop with Ubuntu pre-installed. Together, Dell, Lenovo, HP and ASUS offer over 200 desktop and close to 500 laptop PC models preloaded with Ubuntu, and certified OEM images are also available for Ubuntu Advantage customers. System76 PCs are sold exclusively with Ubuntu. Dell and System76 customers are able to choose between 30-day, three-month and yearly Ubuntu support plans through Canonical. Dell computers running Ubuntu 10.04 include extra support for ATI video graphics, Dell wireless, fingerprint readers, HDMI, Bluetooth, DVD playback (using LinDVD) and MP3/WMA/WMV. Asus is also selling some Asus Eee PCs with Ubuntu pre-installed and announced that "many more" Eee PC models would run Ubuntu in 2011. Vodafone has made available a notebook for the South African market called "Webbook".

Dell sells computers (initially Inspiron 14R and 15R laptops) pre-loaded with Ubuntu in India and China, with 850 and 350 retail outlets respectively. Starting in 2013, Alienware began offering its X51 model gaming desktop pre-installed with Ubuntu at a lower price than if it were pre-installed with Windows.