In software engineering, structured analysis (SA) and structured design (SD) are methods for analyzing business requirements and developing specifications for converting practices into computer programs, hardware configurations, and related manual procedures.
Structured analysis and design techniques are fundamental tools of systems analysis. They developed from classical systems analysis of the 1960s and 1970s.
Objectives of structured analysis
Structured analysis became popular in the 1980s and is still in use today. Structured analysis consists of interpreting the system concept (or real-world situations) into data and control terminology represented by data flow diagrams. The flow of data and control from bubble to data store to bubble can be difficult to track, and the number of bubbles can increase.
One approach is to first define events from the outside world that require the system to react, then assign a bubble to that event. Bubbles that need to interact are then connected until the system is defined. Bubbles are usually grouped into higher level bubbles to decrease complexity.
Data dictionaries are needed to describe the data and command flows, and a process specification is needed to capture the transaction/transformation information.
[FAA (2000). ''FAA System Safety Handbook, Appendix D''. December 30, 2000.]
SA and SD are displayed with structure charts, data flow diagrams, and data model diagrams, of which there were many variations, including those developed by Tom DeMarco, Ken Orr, Larry Constantine, Vaughn Frick, Ed Yourdon, Steven Ward, Peter Chen, and others.
These techniques were combined in various published system development methodologies, including structured systems analysis and design method (SSADM), profitable information by design (PRIDE), Nastec structured analysis & design, SDM/70, and the Spectrum structured system development methodology.
History
Structured analysis is part of a series of structured methods that represent a collection of analysis, design, and programming techniques that were developed in response to the problems facing the software world from the 1960s to the 1980s. In this timeframe most commercial programming was done in COBOL and Fortran, then C and BASIC. There was little guidance on "good" design and programming techniques, and there were no standard techniques for documenting requirements and designs. Systems were getting larger and more complex, and information system development became harder and harder to do.
[Dave Levitt (2000). "Introduction to Structured Analysis and Design." at ''faculty.inverhills.edu/dlevitt''. Retrieved 21 Sep 2008. No longer online 2017.]
As a way to help manage large and complex software, the following structured methods emerged since the end of the 1960s:
* Structured programming in circa 1967 with Edsger Dijkstra - "Go To Statement Considered Harmful"
* Stepwise design in 1971 with Niklaus Wirth
* Nassi–Shneiderman diagram in 1972
* Warnier/Orr diagram in 1974 - "Logical Construction of Programs"
* HIPO in 1974 - IBM hierarchy input-process-output (though this should really be output-input-process)
* Structured design around 1975 with Larry Constantine, Ed Yourdon and Wayne Stevens
* Jackson structured programming in circa 1975, developed by Michael A. Jackson
* Structured analysis in circa 1978 with Tom DeMarco, Edward Yourdon, Gane & Sarson, McMenamin & Palmer
* Structured analysis and design technique (SADT), developed by Douglas T. Ross
* Yourdon structured method, developed by Edward Yourdon
* Structured analysis and system specification, published in 1978 by Tom DeMarco
* Structured systems analysis and design method (SSADM), first presented in 1983, developed by the UK Office of Government Commerce
* Essential Systems Analysis, proposed by Stephen M. McMenamin and John F. Palmer
* IDEF0, based on SADT, developed by Douglas T. Ross in 1985
* Hatley-Pirbhai modeling, defined in "Strategies for Real-Time System Specification" by Derek J. Hatley and Imtiaz A. Pirbhai in 1988
* Modern Structured Analysis, developed by Edward Yourdon after Essential Systems Analysis was published, and published in 1989
* Information technology engineering in circa 1990 with Finkelstein, popularised by James Martin
According to Hay (1999), "information engineering was a logical extension of the structured techniques that were developed during the 1970s. Structured programming led to structured design, which in turn led to structured systems analysis. These techniques were characterized by their use of diagrams: structure charts for structured design, and data flow diagrams for structured analysis, both to aid in communication between users and developers, and to improve the analyst's and the designer's discipline. During the 1980s, tools began to appear which both automated the drawing of the diagrams, and kept track of the things drawn in a data dictionary". After the example of computer-aided design and computer-aided manufacturing (CAD/CAM), the use of these tools was named computer-aided software engineering (CASE).
Structured analysis topics
Single abstraction mechanism
Structured analysis typically creates a hierarchy employing a single abstraction mechanism. The structured analysis method can employ IDEF (see figure); it is process driven and starts with a purpose and a viewpoint. This method identifies the overall function and iteratively divides functions into smaller functions, preserving inputs, outputs, controls, and mechanisms necessary to optimize processes. Also known as a functional decomposition approach, it focuses on cohesion within functions and coupling between functions, leading to structured data.
The functional decomposition of the structured method describes the process without delineating system behavior and dictates system structure in the form of required functions. The method identifies inputs and outputs as related to the activities. One reason for the popularity of structured analysis is its intuitive ability to communicate high-level processes and concepts, whether at the single-system or enterprise level. Discovering how objects might support functions for commercially prevalent object-oriented development is unclear. In contrast to IDEF, the UML is interface driven, with multiple abstraction mechanisms useful in describing service-oriented architectures (SOAs).
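The single-abstraction idea can be made concrete with a small sketch. The following Python fragment is an illustrative assumption, not part of IDEF or any particular tool: it models an IDEF0-style activity with its inputs, controls, outputs, and mechanisms, decomposes it into child activities, and reports which data stay internal to one level of decomposition.
<syntaxhighlight lang="python">
from dataclasses import dataclass, field
from typing import List

@dataclass
class Activity:
    """One box in an IDEF0-style model: a function with its
    Inputs, Controls, Outputs, and Mechanisms (ICOM)."""
    name: str
    inputs: List[str]
    controls: List[str]
    outputs: List[str]
    mechanisms: List[str]
    children: List["Activity"] = field(default_factory=list)

    def decompose(self, *children: "Activity") -> None:
        # A function is iteratively divided into smaller functions;
        # together the children should preserve the parent's inputs and outputs.
        self.children.extend(children)

    def internal_flows(self) -> set:
        """Data produced by child activities but not exposed as a parent
        output -- i.e. flows that stay inside this level of decomposition."""
        if not self.children:
            return set()
        produced = set().union(*(set(c.outputs) for c in self.children))
        return produced - set(self.outputs)

# Hypothetical example: decomposing "Process order"
process_order = Activity(
    "Process order",
    inputs=["customer order"], controls=["pricing policy"],
    outputs=["invoice", "shipment"], mechanisms=["order clerk", "ERP system"])

process_order.decompose(
    Activity("Validate order", ["customer order"], ["pricing policy"],
             ["validated order"], ["order clerk"]),
    Activity("Fulfil order", ["validated order"], [],
             ["invoice", "shipment"], ["ERP system"]))

print(process_order.internal_flows())  # {'validated order'} -- flows only between sub-activities
</syntaxhighlight>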
Approach
Structured analysis views a system from the perspective of the data flowing through it. The function of the system is described by processes that transform the data flows. Structured analysis takes advantage of information hiding through successive decomposition (or top-down) analysis. This allows attention to be focused on pertinent details and avoids confusion from looking at irrelevant details. As the level of detail increases, the breadth of information is reduced. The result of structured analysis is a set of related graphical diagrams, process descriptions, and data definitions. They describe the transformations that need to take place and the data required to meet a system's functional requirements.
[Alan Hecht and Andy Simmons (1986). "Integrating Automated Structured Analysis and Design with Ada Programming Support Environments". NASA, 1986.]
De Marco's approach consists of the following objects (see figure):
* Context diagram
* Data flow diagram
* Process specifications
* Data dictionary
In this approach, the data flow diagrams (DFDs) are directed graphs. The arcs represent data, and the nodes (circles or bubbles) represent processes that transform the data. A process can be further decomposed into a more detailed DFD which shows the subprocesses and data flows within it. The subprocesses can in turn be decomposed further with another set of DFDs until their functions can be easily understood. Functional primitives are processes which do not need to be decomposed further; they are described by a process specification (or mini-spec). The process specification can consist of pseudo-code, flowcharts, or structured English. The DFDs model the structure of the system as a network of interconnected processes composed of functional primitives. The data dictionary is a set of entries (definitions) of data flows, data elements, files, and databases. The data dictionary entries are partitioned in a top-down manner. They can be referenced in other data dictionary entries and in data flow diagrams.
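As a rough sketch of how such a graph might be represented, the Python fragment below (hypothetical names, not taken from De Marco) stores processes and labelled data flows as a directed graph, records decompositions of processes into child DFDs, and treats a process that has a mini-spec but no further decomposition as a functional primitive.
<syntaxhighlight lang="python">
from collections import defaultdict

class DataFlowDiagram:
    """A DFD as a directed graph: nodes are processes, data stores, or
    external entities; each arc is labelled with the data that flows."""
    def __init__(self):
        self.flows = defaultdict(list)   # source -> [(target, data), ...]
        self.decomposition = {}          # process -> child DataFlowDiagram
        self.minispecs = {}              # functional primitive -> spec text

    def add_flow(self, source, target, data):
        self.flows[source].append((target, data))

    def decompose(self, process, child_dfd):
        # A process may be refined into a more detailed DFD of subprocesses.
        self.decomposition[process] = child_dfd

    def is_functional_primitive(self, process):
        # A functional primitive is not decomposed further; it is described
        # by a process specification (mini-spec) instead.
        return process not in self.decomposition and process in self.minispecs

# Hypothetical order-handling fragment
dfd = DataFlowDiagram()
dfd.add_flow("Customer", "Validate order", "order")
dfd.add_flow("Validate order", "Orders store", "validated order")
dfd.add_flow("Orders store", "Prepare invoice", "validated order")
dfd.minispecs["Prepare invoice"] = (
    "For each validated order: price items, apply tax, emit invoice.")

print(dfd.is_functional_primitive("Prepare invoice"))   # True
print(dfd.is_functional_primitive("Validate order"))    # False (no mini-spec yet)
</syntaxhighlight>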
Context diagram
Context diagrams are diagrams that represent the actors outside a system that could interact with that system. This diagram is the highest-level view of a system, similar to a block diagram, showing a, possibly software-based, system as a whole and its inputs and outputs from/to external factors.
This type of diagram, according to Kossiakoff (2003), usually "pictures the system at the center, with no details of its interior structure, surrounded by all its interacting systems, environment and activities. The objective of a system context diagram is to focus attention on external factors and events that should be considered in developing a complete set of system requirements and constraints".
[Alexander Kossiakoff, William N. Sweet (2003). ''Systems Engineering: Principles and Practices''. p. 413.] System context diagrams are related to data flow diagrams, and show the interactions between a system and other actors which the system is designed to face. System context diagrams can be helpful in understanding the context in which the system will operate.
Data dictionary
A data dictionary or ''database dictionary'' is a file that defines the basic organization of a database. [Data Integration Glossary, U.S. Department of Transportation, August 2001.] A database dictionary contains a list of all files in the database, the number of records in each file, and the names and types of each data field. Most database management systems keep the data dictionary hidden from users to prevent them from accidentally destroying its contents. Data dictionaries do not contain any actual data from the database, only bookkeeping information for managing it. Without a data dictionary, however, a database management system cannot access data from the database.
Database users and application developers can benefit from an authoritative data dictionary document that catalogs the organization, contents, and conventions of one or more databases. This typically includes the names and descriptions of various tables and fields in each database, plus additional details, like the type and length of each data element. There is no universal standard as to the level of detail in such a document, but it is primarily a distillation of metadata about database structure, not the data itself. A data dictionary document also may include further information describing how data elements are encoded. One of the advantages of well-designed data dictionary documentation is that it helps to establish consistency throughout a complex database, or across a large collection of federated databases.
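A minimal sketch of such a data dictionary document, assuming hypothetical table and field names and a level of detail chosen purely for illustration, might look like the following Python fragment, which records fields, types, lengths, and encodings and checks that an element name is not declared with conflicting types.
<syntaxhighlight lang="python">
from dataclasses import dataclass
from typing import Optional

@dataclass
class FieldEntry:
    name: str
    data_type: str            # e.g. "CHAR", "DECIMAL", "DATE"
    length: Optional[int]     # length of the data element, if applicable
    description: str = ""
    encoding: str = ""        # how the element is encoded, e.g. a code list

# A tiny data dictionary document: table name -> list of field entries.
data_dictionary = {
    "CUSTOMER": [
        FieldEntry("CUST_ID", "CHAR", 8, "Unique customer identifier"),
        FieldEntry("STATUS", "CHAR", 1, "Account status",
                   encoding="A=active, C=closed"),
    ],
    "ORDER": [
        FieldEntry("ORDER_NO", "CHAR", 10, "Order number"),
        FieldEntry("TOTAL", "DECIMAL", 12, "Order total in cents"),
    ],
}

# Consistency matters across a complex or federated database: the same
# element name should not carry conflicting types in different tables.
types_by_name = {}
for table, fields in data_dictionary.items():
    for f in fields:
        if types_by_name.setdefault(f.name, f.data_type) != f.data_type:
            print(f"Inconsistent type for {f.name} in {table}")
</syntaxhighlight>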
Data flow diagrams
A data flow diagram (DFD) is a graphical representation of the "flow" of data through an information system. It differs from the system flowchart as it shows the flow of data through processes instead of computer hardware. Data flow diagrams were invented by Larry Constantine, developer of structured design, based on Martin and Estrin's "data flow graph" model of computation.
It is common practice to draw a system context diagram first, which shows the interaction between the system and outside entities. The DFD is designed to show how a system is divided into smaller portions and to highlight the flow of data between those parts. This context-level data flow diagram is then "exploded" to show more detail of the system being modeled.
Data flow diagrams (DFDs) are one of the three essential perspectives of the structured systems analysis and design method (SSADM). The sponsor of a project and the end users will need to be briefed and consulted throughout all stages of a system's evolution. With a data flow diagram, users are able to visualize how the system will operate, what the system will accomplish, and how the system will be implemented. The old system's data flow diagrams can be drawn up and compared with the new system's in order to implement a more efficient system. Data flow diagrams can be used to provide the end user with a physical idea of where the data they input ultimately has an effect upon the structure of the whole system, from order to dispatch to recook. How any system is developed can be determined through a data flow diagram.
Structure chart
A structure chart (SC) is a chart that shows the breakdown of the configuration system to the lowest manageable levels. ["Configuration Management". In: ''IRS Resources Part 2. Information Technology Chapter 27. Configuration Management''. Accessed 14 Nov 2008.] This chart is used in structured programming to arrange the program modules in a tree structure. Each module is represented by a box which contains the name of the module. The tree structure visualizes the relationships between the modules.
Structure charts are used in structured analysis to specify the high-level design, or architecture, of a computer program. As a design tool, they aid the programmer in dividing and conquering a large software problem, that is, recursively breaking a problem down into parts that are small enough to be understood by a human brain. The process is called top-down design, or functional decomposition. Programmers use a structure chart to build a program in a manner similar to how an architect uses a blueprint to build a house. In the design stage, the chart is drawn and used as a way for the client and the various software designers to communicate. During the actual building of the program (implementation), the chart is continually referred to as the master plan.
[David Wolber. "Structure Charts": Supplementary Notes, Structure Charts and Bottom-up Implementation: Java Version.]
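A structure chart can be approximated in code as a tree of named modules, each box calling the boxes beneath it. The following Python sketch is a hypothetical illustration of that tree, not a notation from any particular method; it simply prints the module hierarchy produced by top-down design.
<syntaxhighlight lang="python">
class Module:
    """One box in a structure chart: a named module and the modules it calls."""
    def __init__(self, name, *subordinates):
        self.name = name
        self.subordinates = list(subordinates)

    def show(self, depth=0):
        # Print the calling hierarchy, one level of indentation per level
        # of the top-down decomposition.
        print("  " * depth + self.name)
        for sub in self.subordinates:
            sub.show(depth + 1)

# Hypothetical payroll program broken down by functional decomposition.
chart = Module("Produce payroll",
               Module("Get timesheets",
                      Module("Read timesheet file"),
                      Module("Validate hours")),
               Module("Calculate pay",
                      Module("Compute gross pay"),
                      Module("Compute deductions")),
               Module("Print pay cheques"))

chart.show()
</syntaxhighlight>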
Structured design
Structured design (SD) is concerned with the development of modules and the synthesis of these modules in a so-called "module hierarchy". In order to design optimal module structure and interfaces, two principles are crucial (a short code sketch contrasting them follows this list):
* ''Cohesion'', which is "concerned with the grouping of functionally related processes into a particular module", and
* ''Coupling'', which relates to "the flow of information or parameters passed between modules. Optimal coupling reduces the interfaces of modules and the resulting complexity of the software".
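Here is a small, hypothetical Python contrast of the two principles: the first function belongs to a module that groups only functionally related tax logic and takes a narrow parameter list (high cohesion, low coupling), while the second passes a whole order record between modules, widening the interface and coupling the caller to the record's internal structure.
<syntaxhighlight lang="python">
# High cohesion: the module groups only functionally related tax logic.
# Low coupling: callers pass just the two values the function needs.
def sales_tax(net_amount: float, rate: float) -> float:
    return round(net_amount * rate, 2)

# Tighter coupling: the whole order dictionary is passed between modules,
# so this function now also depends on the order's internal structure.
def sales_tax_from_order(order: dict) -> float:
    return round(order["net_amount"] * order["tax_rate"], 2)

order = {"net_amount": 100.0, "tax_rate": 0.2, "customer": "ACME"}
print(sales_tax(order["net_amount"], order["tax_rate"]))  # 20.0
print(sales_tax_from_order(order))                        # 20.0, but a wider interface
</syntaxhighlight>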
Structured design was developed by Larry Constantine in the late 1960s, then refined and published with collaborators in the 1970s; see Larry Constantine: structured design for details. One refinement of this approach consists of three main objects:
* structure charts
* module specifications
* data dictionary.
The structure chart aims to show "the module hierarchy or calling sequence relationship of modules. There is a module specification for each module shown on the structure chart. The module specifications can be composed of pseudo-code or a program design language. The data dictionary is like that of structured analysis. At this stage in the software development lifecycle, after analysis and design have been performed, it is possible to automatically generate data type declarations" [Belkhouche, B., and J.E. Urban (1986). "Direct Implementation of Abstract Data Types from Abstract Specifications". In: ''IEEE Transactions on Software Engineering'', pp. 549-661, May 1986.] and procedure or subroutine templates.
Criticisms
Problems with data flow diagrams have included the following:
# Choosing bubbles appropriately
# Partitioning bubbles in a meaningful and mutually agreed upon manner
# Documentation size needed to understand the data flows
# Data flow diagrams are strongly functional in nature and thus subject to frequent change
# Though "data" flow is emphasized, "data" modeling is not, so there is little understanding of the subject matter of the system
# Customers have difficulty following how the concept is mapped into data flows and bubbles
# Designers must shift the DFD organization into an implementable format
See also
* Event partitioning
* Flow-based programming
* HIPO
* Jackson structured programming
* Prosa Structured Analysis Tool
* Soft systems methodology
References
Further reading
* Tom DeMarco (1978). ''Structured Analysis and System Specification''. Yourdon.
* Derek J. Hatley, Imtiaz A. Pirbhai (1988). ''Strategies for Real Time System Specification''. John Wiley and Sons Ltd.
* Stephen J. Mellor and Paul T. Ward (1986). ''Structured Development for Real-Time Systems, Vol. 3: Implementation Modeling Techniques''. Prentice Hall.
* Edward Yourdon (1989). ''Modern Structured Analysis''. Yourdon Press Computing Series.
* Keith Edwards (1993). ''Real-Time Structured Methods, System Analysis''. Wiley.
External links
Structured Analysis Wiki, CRaG Systems, 2004.