heidibrayer (2976759) writes "Every day, analysts at major anti-virus companies and research organizations are inundated with new malware samples. From Flame to lesser-known strains, figures indicate that the number of malware samples released each day continues to rise. In 2011, malware authors unleashed approximately 70,000 new strains per day, according to figures reported by Eugene Kaspersky. The following year, McAfee reported that 100,000 new strains of malware were unleashed each day. An article published in the October 2013 issue of IEEE Spectrum updated that figure to approximately 150,000 new malware strains per day. Not enough manpower exists to manually address the sheer volume of new malware samples that arrive daily in analysts’ queues. In our work here at CERT, we felt that analysts needed an approach that would allow them to identify and focus first on the most destructive binary files. This blog post is a follow-up to my earlier post entitled Prioritizing Malware Analysis. In this post, we describe the results of the research I conducted with fellow researchers at the Carnegie Mellon University (CMU) Software Engineering Institute (SEI) and CMU’s Robotics Institute, highlighting our analysis that demonstrated the validity (with 98 percent accuracy) of our approach, which helps analysts distinguish between malicious and benign binary files." Link to Original Source
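The post above does not detail the features CERT's classifier actually uses, so as a purely illustrative sketch of automated triage, here is a byte-entropy heuristic. High entropy is a common (if crude) indicator of packed or encrypted binaries; the threshold and the `triage` routing are assumptions for exposition, not CERT's method.

```python
import math
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (0.0 to 8.0)."""
    if not data:
        return 0.0
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

def triage(sample: bytes, threshold: float = 7.0) -> str:
    """Route high-entropy (often packed or encrypted) samples to the front of the queue."""
    return "review-first" if byte_entropy(sample) >= threshold else "normal-queue"
```

A uniform byte distribution scores the maximum 8.0 bits per byte, while a run of a single byte scores 0.0, so the heuristic separates the two extremes cleanly; real triage would combine many such features.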
A Method to Identify Gaps in Security at International Mail Processing Centers
heidibrayer (2976759) writes "In October 2010, two packages from Yemen containing explosives were discovered on U.S.-bound cargo planes of two of the largest worldwide shipping companies, UPS and FedEx, according to reports by CNN and the Wall Street Journal. The discovery highlighted a long-standing problem—securing international cargo—and ushered in a new area of concern for such entities as the United States Postal Inspection Service (USPIS) and the Universal Postal Union (UPU), a specialized agency of the United Nations that regulates the postal services of 192 member countries. In early 2012, the UPU and several stakeholder organizations developed two standards to improve security in the transport of international mail and to improve the security of critical postal facilities. As with any new set of standards, however, a mechanism was needed to enable implementation of the standards and measure compliance with them. This blog post describes the method developed by researchers in the CERT Division at Carnegie Mellon University’s Software Engineering Institute, in conjunction with the USPIS, to identify gaps in the security of international mail processing centers and similar shipping and transportation processing facilities." Link to Original Source
Social Engineering and Unintentional Insider Threat
heidibrayer (2976759) writes "Social engineering involves the manipulation of individuals to get them to unwittingly perform actions that cause harm or increase the probability of causing future harm, which we call “unintentional insider threat.” This blog post highlights recent research that aims to add to the body of knowledge about the factors that lead to unintentional insider threat (UIT) and about how organizations in industry and government can protect themselves. This research is part of an ongoing body of work on social engineering and UIT conducted by the CERT Insider Threat Center at the Carnegie Mellon University Software Engineering Institute." Link to Original Source
What's the Best IT Risk Assessment Method for Your Organization?
heidibrayer (2976759) writes "In this podcast, Ben Tomhave and Erik Heidt, research directors with Gartner Technical Professionals, discuss methods for IT risk assessment and analysis and comparison factors for selecting the methods that are the best fit for your organization." Link to Original Source
A New Approach to Protecting Critical Information Systems
heidibrayer (2976759) writes "The source of a recent Target security breach that allowed intruders to gain access to more than 40 million credit and debit cards of customers between Nov. 27 and Dec. 14, 2013, has been traced to a heating, ventilation, and air conditioning (HVAC) service sub-contractor in Sharpsburg, Pa., just outside of Pittsburgh, according to a Feb. 5 post on a Wall Street Journal blog. The post stated that the intruders were able to gain access to Target’s system after stealing login credentials from one of Target’s HVAC subcontractors, who had been given remote access. This breach demonstrates how any vulnerability in a critical information system can be exploited to disrupt or harm the normal operation of any commercial or industrial sector. In this blog post, we will present a tool we have developed that increases a security incident responder’s ability to assess risk and identify the appropriate incident response plan for critical information systems." Link to Original Source
heidibrayer (2976759) writes "At Flickr, the video- and photo-sharing website, the live software platform is updated at least 10 times a day. Flickr accomplishes this through an automated testing cycle that includes comprehensive unit testing and integration testing at all levels of the software stack in a realistic staging environment. If the code passes, it is then tagged, released, built, and pushed into production. This type of lean organization, where software is delivered on a continuous basis, is exactly what the agile founders envisioned when crafting their manifesto: a nimble, streamlined process for developing and deploying software into the hands of users while continuously integrating feedback and new requirements. A key to Flickr’s prolific deployment is DevOps, a software development concept that literally and figuratively blends development and operations staff and tools in response to the increasing need for interoperability. This blog post, the first in a series, introduces DevOps and explores its impact from an internal perspective on our own software development practices and through the lens of its impact on the software community at large." Link to Original Source
heidibrayer (2976759) writes "Although the CERT Secure Coding Team has developed secure coding rules and guidelines for Java, prior to 2013 we had not developed a set of secure coding rules that were specific to Java’s application in the Android platform. Android is an important area to focus on, given its mobile device market dominance (82 percent of worldwide market share in the third quarter of 2013) as well as the adoption of Android by the Department of Defense. This blog post, the first in a series, discusses the initial development of our Android rules and guidelines. This initial development included mapping our existing Java secure coding rules and guidelines to Android applicability and creating new Android-only rules for Java secure coding." Link to Original Source
The Importance of Automated Testing in Open Systems Architectures
heidibrayer (2976759) writes "The Better Buying Power 2.0 initiative is a concerted effort by the United States Department of Defense to achieve greater efficiencies in the development, sustainment, and recompetition of major defense acquisition programs through cost control, elimination of unproductive processes and bureaucracy, and promotion of open competition. This SEI blog posting describes how the Navy is operationalizing Better Buying Power in the context of its Open Systems Architecture and Business Innovation initiatives. This posting also presents the results from a recent online war game that underscore the importance of automated testing in these initiatives to help avoid common traps and pitfalls of earlier cost containment measures." Link to Original Source
What Are Your Thoughts on the Value of Code Cloning?
heidibrayer (2976759) writes "There has been significant debate in software engineering about the value of code cloning. Code clones are implementation patterns transferred from program to program via copy mechanisms, including cut-and-paste, copy-and-paste, and code reuse.
In its most basic form, code cloning may involve a codelet (a snippet of code) that undergoes various forms of evolution, such as slight modification in response to problems. Software reuse quickens the production cycle for augmented functions and data structures. So, if a programmer copies a codelet from one file into another with slight augmentations, a new clone has been created stemming from a founder codelet. Events like these constitute the provenance, or historical record, of all events affecting a codelet object. In a recent post on the SEI blog, I describe research that aims to understand the evolution of source and machine code and, eventually, create a model that can recover relationships between code, files, or executable formats where the provenance is not known.
Please read about our research and let me know if you have any comments or questions on our work by leaving feedback in the comments section of the post." Link to Original Source
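The founder/clone relationship described above can be made concrete with a toy similarity check. This is only an illustrative sketch using Python's standard difflib, not the provenance-recovery model the research describes; the two snippets below are invented examples.

```python
import difflib

# A "founder" codelet, and a clone of it: copied, renamed, and slightly augmented.
founder = """
def mean(xs):
    total = 0
    for x in xs:
        total += x
    return total / len(xs)
"""

clone = """
def mean(values):
    if not values:
        return 0
    total = 0
    for v in values:
        total += v
    return total / len(values)
"""

def clone_similarity(a: str, b: str) -> float:
    """Token-based similarity ratio in [0, 1]; values near 1 suggest a likely clone."""
    return difflib.SequenceMatcher(None, a.split(), b.split()).ratio()

score = clone_similarity(founder, clone)
```

An exact copy scores 1.0 and the augmented clone still scores well above unrelated code, which is the intuition behind recovering a clone's lineage when no explicit provenance record exists.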
Security Pattern Assurance Through Round-Trip Engineering
heidibrayer (2976759) writes "The process of designing and analyzing software architectures is complex. Architectural design is a minimally constrained search through a vast multi-dimensional space of possibilities. The end result is that architects are seldom confident that they have done the job optimally, or even satisfactorily. Over the past two decades, practitioners and researchers have used architectural patterns to expedite sound software design. Architectural patterns are prepackaged chunks of design that provide proven structural solutions for achieving particular software system quality attributes, such as scalability or modifiability. While use of patterns has simplified the architectural design process somewhat, key challenges remain. This blog post explores these challenges and our solutions for achieving system security qualities through use of patterns." Link to Original Source
Using Scenario-Based Architecture Analysis to Inform Code Quality Measures
heidibrayer (2976759) writes "As the pace of software delivery increases, organizations need guidance on how to deliver high-quality software rapidly, while simultaneously meeting demands related to time-to-market, cost, productivity, and quality. In practice, demands for adding new features or fixing defects often take priority. However, when software developers are guided solely by project management measures, such as progress on requirements and defect counts, they ignore the impact of architectural dependencies, which can impede the progress of a project if not properly managed. In previous posts on this blog, my colleague Ipek Ozkaya and I have focused on architectural technical debt, which refers to the rework and degraded quality resulting from overly hasty delivery of software capabilities to users. This blog post describes a first step towards an approach we developed that aims to use qualitative architectural measures to better inform quantitative code quality metrics." Link to Original Source
What Role Does Software Play in Unintended Acceleration in Vehicles?
heidibrayer (2976759) writes "Safety-critical avionics, aerospace, medical, and automotive systems are becoming increasingly reliant on software. Malfunctions in these systems can have significant consequences, including mission failure and loss of life. So, they must be designed, verified, and validated carefully to ensure that they comply with system specifications and requirements and are error-free. This blog post describes an effort at the SEI that aims to help engineers use time-proven architecture patterns (such as the publish-subscribe pattern or correct use of shared resources) and validate their correct application." Link to Original Source
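As a minimal illustration of the publish-subscribe pattern mentioned above (a sketch for exposition, not the SEI's validation tooling), publishers and subscribers exchange messages through a broker without ever referencing each other directly; the topic names and automotive example below are assumptions:

```python
from collections import defaultdict
from typing import Any, Callable

class Bus:
    """Minimal publish-subscribe broker: senders and receivers stay decoupled."""

    def __init__(self) -> None:
        self._handlers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._handlers[topic].append(handler)

    def publish(self, topic: str, message: Any) -> None:
        # Deliver to every subscriber of the topic; publishers never hold
        # direct references to subscribers, which is what the pattern buys you.
        for handler in self._handlers[topic]:
            handler(message)

# Hypothetical automotive example: a speed sensor publishes, cruise control listens.
bus = Bus()
readings = []
bus.subscribe("wheel_speed", readings.append)
bus.publish("wheel_speed", {"rpm": 820})
```

The decoupling is exactly what makes the pattern attractive in safety-critical designs, and also what makes its misuse (e.g., an unsubscribed critical listener) worth validating automatically.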
An Approach to Addressing the Software Engineering Challenges of Big Data
heidibrayer (2976759) writes "In this episode, Ian Gorton and John Klein discuss big data and the challenges it presents for software engineers. With help from fellow SEI researchers, the two have developed a lightweight risk reduction approach to help software engineers manage the challenges of big data. Called Lightweight Evaluation and Architecture Prototyping (for Big Data), the approach is based on principles drawn from proven architecture and technology analysis and evaluation techniques to help the Department of Defense (DoD) and other enterprises in domains including avionics, communications, and healthcare develop and evolve systems to manage big data." Link to Original Source
heidibrayer (2976759) writes "In this blog post, the fourth installment in a series on Agile readiness, SEI researcher Suzanne Miller focuses on helping organizations understand the risks involved when contemplating or embarking on the adoption of new practices, in this case Agile methods. This latest installment explores project and customer environment, one of many challenging factors to assess when considering Agile adoption readiness." Link to Original Source
heidibrayer (2976759) writes "This blog post introduces three variants on the V model of system or software development that make it more useful to testers, quality engineers, and other stakeholders interested in the use of testing as a verification and validation method." Link to Original Source
heidibrayer (2976759) writes "In this podcast, Eric Werner discusses research that he and several of his colleagues are conducting to help software developers create systems for the many-core central processing units in massively parallel computing environments. Eric and his team are creating a software library that can exploit the heterogeneous parallel computers of the future and allow developers to create systems that are more efficient at computation and power consumption." Link to Original Source
heidibrayer (2976759) writes "Ian Gorton and John Klein, both senior members of the technical staff at the Carnegie Mellon University Software Engineering Institute, discuss the software engineering challenges of big data. In their post, Gorton and Klein describe a lightweight risk reduction approach that they developed aimed at helping the Department of Defense and other enterprises develop and evolve big data systems. The approach, known as Lightweight Evaluation and Architecture Prototyping for Big Data, or LEAP(4BD), draws upon proven architecture and technology analysis and evaluation techniques." Link to Original Source
Architecture Analysis & Design Language: The Initial Foundations
heidibrayer (2976759) writes "When life- and safety-critical systems fail (and this happens in many domains), the results can be dire, including loss of property and life. These types of systems are increasingly prevalent, and can be found in the attitude control systems of a satellite, the software-reliant systems of a car (such as its cruise control and anti-lock braking system), or medical devices that emit radiation. When developing such systems, software and systems architects must balance the need for stability and safety with stakeholder demands and time-to-market constraints." Link to Original Source