In software engineering, structured analysis (SA) and structured design (SD) are methods for analyzing business requirements and developing specifications for converting practices into computer programs, hardware configurations, and related manual procedures. Structured analysis and design techniques are fundamental tools of systems analysis. They developed from classical systems analysis of the 1960s and 1970s.


Objectives of structured analysis

Structured analysis became popular in the 1980s and is still in use today. Structured analysis consists of interpreting the system concept (or real-world situations) into data and control terminology represented by data flow diagrams. The flow of data and control from bubble to data store to bubble can be difficult to track, and the number of bubbles can grow large. One approach is to first define the events from the outside world that require the system to react, then assign a bubble to each event. Bubbles that need to interact are then connected until the system is defined. Bubbles are usually grouped into higher-level bubbles to decrease complexity. Data dictionaries are needed to describe the data and command flows, and a process specification is needed to capture the transaction/transformation information. (FAA (2000), ''FAA System Safety Handbook, Appendix D'', December 30, 2000.)
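For illustration, the event-partitioning step might be recorded as in the sketch below. The events, process ("bubble") names, and data stores are hypothetical and are not drawn from the cited handbook; this is only a minimal way of writing down the idea.

    # Hypothetical illustration of event partitioning: each external event is
    # answered by one process ("bubble"), together with the data stores it uses.
    events_to_bubbles = {
        "customer places order":    {"bubble": "Accept Order",    "stores": ["Orders"]},
        "warehouse ships order":    {"bubble": "Record Shipment", "stores": ["Orders", "Inventory"]},
        "customer requests status": {"bubble": "Report Status",   "stores": ["Orders"]},
    }

    # Bubbles touching the same store are candidates for connection, and related
    # bubbles can later be grouped into a single higher-level bubble.
    for event, part in events_to_bubbles.items():
        print(f"{event} -> {part['bubble']} (stores: {', '.join(part['stores'])})")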
SA and SD are displayed with structure charts, data flow diagrams, and data model diagrams, of which there were many variations, including those developed by Tom DeMarco, Ken Orr, Larry Constantine, Vaughn Frick, Ed Yourdon, Steven Ward, Peter Chen, and others. These techniques were combined in various published system development methodologies, including the structured systems analysis and design method (SSADM), profitable information by design (PRIDE), Nastec structured analysis & design, SDM/70, and the Spectrum structured system development methodology.


History

Structured analysis is part of a series of structured methods that represent a collection of analysis, design, and programming techniques developed in response to the problems facing the software world from the 1960s to the 1980s. In this timeframe most commercial programming was done in COBOL and Fortran, then C and BASIC. There was little guidance on "good" design and programming techniques, and there were no standard techniques for documenting requirements and designs. Systems were getting larger and more complex, and information system development became harder and harder to do. (Dave Levitt (2000), "Introduction to Structured Analysis and Design", at ''faculty.inverhills.edu/dlevitt''. Retrieved 21 Sep 2008; no longer online as of 2017.) As a way to help manage large and complex software, the following structured methods emerged since the end of the 1960s:
* Structured programming in circa 1967 with Edsger Dijkstra - "Go To Statement Considered Harmful"
* Niklaus Wirth's stepwise design in 1971
* Nassi–Shneiderman diagram in 1972
* Warnier/Orr diagram in 1974 - "Logical Construction of Programs"
* HIPO in 1974 - IBM hierarchy input-process-output (though this should really be output-input-process)
* Structured design around 1975 with Larry Constantine, Ed Yourdon and Wayne Stevens
* Jackson structured programming in circa 1975, developed by Michael A. Jackson
* Structured analysis in circa 1978 with Tom DeMarco, Edward Yourdon, Gane & Sarson, and McMenamin & Palmer
* Structured analysis and design technique (SADT), developed by Douglas T. Ross
* Yourdon structured method, developed by Edward Yourdon
* Structured analysis and system specification, published in 1978 by Tom DeMarco
* Structured systems analysis and design method (SSADM), first presented in 1983, developed by the UK Office of Government Commerce
* Essential systems analysis, proposed by Stephen M. McMenamin and John F. Palmer
* IDEF0, based on SADT, developed by Douglas T. Ross in 1985
* Hatley-Pirbhai modeling, defined in "Strategies for Real-Time System Specification" by Derek J. Hatley and Imtiaz A. Pirbhai in 1988
* Modern structured analysis, developed by Edward Yourdon after Essential Systems Analysis, published in 1989
* Information technology engineering in circa 1990 with Finkelstein, popularised by James Martin

According to Hay (1999), "information engineering was a logical extension of the structured techniques that were developed during the 1970s. Structured programming led to structured design, which in turn led to structured systems analysis. These techniques were characterized by their use of diagrams: structure charts for structured design, and data flow diagrams for structured analysis, both to aid in communication between users and developers, and to improve the analyst's and the designer's discipline. During the 1980s, tools began to appear which both automated the drawing of the diagrams, and kept track of the things drawn in a data dictionary." After the example of computer-aided design and computer-aided manufacturing (CAD/CAM), the use of these tools was named computer-aided software engineering (CASE).


Structured analysis topics


Single abstraction mechanism

Structured analysis typically creates a hierarchy employing a single abstraction mechanism. The structured analysis method can employ IDEF (see figure); it is process driven and starts with a purpose and a viewpoint. This method identifies the overall function and iteratively divides functions into smaller functions, preserving inputs, outputs, controls, and mechanisms necessary to optimize processes. Also known as a functional decomposition approach, it focuses on cohesion within functions and coupling between functions, leading to structured data. The functional decomposition of the structured method describes the process without delineating system behavior and dictates system structure in the form of required functions. The method identifies inputs and outputs as related to the activities. One reason for the popularity of structured analysis is its intuitive ability to communicate high-level processes and concepts, whether at the single-system or enterprise level. How objects might support these functions in commercially prevalent object-oriented development, however, is unclear. In contrast to IDEF, UML is interface driven, with multiple abstraction mechanisms useful in describing service-oriented architectures (SOAs).
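As a minimal sketch of this kind of functional decomposition, the code below assumes a hypothetical Function record that carries IDEF0-style inputs, outputs, controls, and mechanisms; the overall function is divided into smaller functions of the same shape. The names and field layout are illustrative only.

    from __future__ import annotations
    from dataclasses import dataclass, field

    @dataclass
    class Function:
        """One node in an IDEF0-style functional decomposition (hypothetical)."""
        name: str
        inputs: list[str] = field(default_factory=list)
        outputs: list[str] = field(default_factory=list)
        controls: list[str] = field(default_factory=list)    # e.g. policies, standards
        mechanisms: list[str] = field(default_factory=list)  # e.g. people, tools
        subfunctions: list[Function] = field(default_factory=list)

    # The overall function is identified first and then iteratively divided;
    # inputs, outputs, controls and mechanisms are preserved at every level.
    fulfil_order = Function(
        name="Fulfil order",
        inputs=["order"], outputs=["shipment"],
        controls=["shipping policy"], mechanisms=["warehouse staff"],
        subfunctions=[
            Function("Check stock", inputs=["order"], outputs=["picking list"]),
            Function("Pack and ship", inputs=["picking list"], outputs=["shipment"]),
        ],
    )
    print(len(fulfil_order.subfunctions))  # 2

Keeping strongly related work inside one subfunction and keeping the lists of passed data short corresponds to the cohesion and coupling concerns mentioned above.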


Approach

Structured analysis views a system from the perspective of the data flowing through it. The function of the system is described by processes that transform the data flows. Structured analysis takes advantage of information hiding through successive decomposition (or top-down) analysis. This allows attention to be focused on pertinent details and avoids confusion from looking at irrelevant details. As the level of detail increases, the breadth of information is reduced. The result of structured analysis is a set of related graphical diagrams, process descriptions, and data definitions. They describe the transformations that need to take place and the data required to meet a system's functional requirements. (Alan Hecht and Andy Simmons (1986), "Integrating Automated Structured Analysis and Design with Ada Programming Support Environments", NASA, 1986.)

De Marco's approach consists of the following objects (see figure):
* Context diagram
* Data flow diagram
* Process specifications
* Data dictionary

In this approach the data flow diagrams (DFDs) are directed graphs. The arcs represent data, and the nodes (circles or bubbles) represent processes that transform the data. A process can be further decomposed into a more detailed DFD which shows the subprocesses and data flows within it. The subprocesses can in turn be decomposed further with another set of DFDs until their functions can be easily understood. Functional primitives are processes which do not need to be decomposed further; they are described by a process specification (or mini-spec). The process specification can consist of pseudo-code, flowcharts, or structured English. The DFDs model the structure of the system as a network of interconnected processes composed of functional primitives. The data dictionary is a set of entries (definitions) of data flows, data elements, files, and databases. The data dictionary entries are partitioned in a top-down manner. They can be referenced in other data dictionary entries and in data flow diagrams.
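The sketch below shows how these objects might be held as simple data structures: the DFD as a directed graph whose nodes are processes and whose arcs are named data flows, with functional primitives carrying a mini-spec in structured English. The process names, flow names, and dictionary layout are hypothetical and are not DeMarco's notation.

    # Hypothetical sketch: a DFD held as a directed graph. Nodes are processes
    # (bubbles); arcs are named data flows; functional primitives carry a
    # mini-spec written in structured English.
    dfd = {
        "processes": {
            "1 Validate order": {
                "explodes_to": None,   # functional primitive
                "mini_spec": "IF order total exceeds credit limit THEN reject order "
                             "ELSE record order as valid",
            },
            "2 Fill order": {
                "explodes_to": "DFD 2",  # detailed in a lower-level DFD
                "mini_spec": None,
            },
        },
        "flows": [  # (source, data flow, destination)
            ("Customer", "order", "1 Validate order"),
            ("1 Validate order", "valid order", "2 Fill order"),
            ("2 Fill order", "invoice", "Customer"),
        ],
    }

    # Processes that need no further decomposition are described only by a mini-spec.
    primitives = [name for name, p in dfd["processes"].items() if p["explodes_to"] is None]
    print(primitives)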


Context diagram

Context diagrams are diagrams that represent the actors outside a system that could interact with that system. This diagram is the highest-level view of a system, similar to a block diagram, showing a (possibly software-based) system as a whole and its inputs and outputs from/to external factors. According to Kossiakoff (2003), this type of diagram usually "pictures the system at the center, with no details of its interior structure, surrounded by all its interacting systems, environment and activities. The objective of a system context diagram is to focus attention on external factors and events that should be considered in developing a complete set of system requirements and constraints". (Alexander Kossiakoff, William N. Sweet (2003), ''Systems Engineering: Principles and Practices'', p. 413.) System context diagrams are related to data flow diagrams, and show the interactions between a system and the other actors which the system is designed to face. System context diagrams can be helpful in understanding the context of which the system will be a part.
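A context diagram can be reduced to very little data: the system name plus the external entities and the flows that cross the system boundary. The representation below is a hypothetical sketch, not a standard notation; the entity and flow names are invented for illustration.

    # Hypothetical sketch of a system context diagram: the system appears as a
    # single node; only external entities and boundary-crossing flows are shown.
    context = {
        "system": "Order processing system",
        "external_entities": ["Customer", "Warehouse", "Bank"],
        "inputs": [("Customer", "order"), ("Bank", "payment confirmation")],
        "outputs": [("Warehouse", "picking list"), ("Customer", "invoice")],
    }
    print(f"{context['system']} interacts with {len(context['external_entities'])} external entities")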


Data dictionary

A data dictionary or ''database dictionary'' is a file that defines the basic organization of a database. (Data Integration Glossary, U.S. Department of Transportation, August 2001.) A database dictionary contains a list of all files in the database, the number of records in each file, and the names and types of each data field. Most database management systems keep the data dictionary hidden from users to prevent them from accidentally destroying its contents. Data dictionaries do not contain any actual data from the database, only bookkeeping information for managing it. Without a data dictionary, however, a database management system cannot access data from the database.

Database users and application developers can benefit from an authoritative data dictionary document that catalogs the organization, contents, and conventions of one or more databases. This typically includes the names and descriptions of the various tables and fields in each database, plus additional details, such as the type and length of each data element. There is no universal standard for the level of detail in such a document, but it is primarily a distillation of metadata about database structure, not the data itself. A data dictionary document may also include further information describing how data elements are encoded. One of the advantages of well-designed data dictionary documentation is that it helps to establish consistency throughout a complex database, or across a large collection of federated databases.
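A fragment of such a data dictionary might be recorded as in the sketch below; the table and field names are hypothetical, and the level of detail shown is only one of many possibilities.

    # Hypothetical data dictionary fragment: bookkeeping information about the
    # database structure only, never the stored data itself.
    data_dictionary = {
        "customer": {
            "description": "One record per registered customer",
            "fields": {
                "customer_id": {"type": "integer", "length": 10},
                "name":        {"type": "char",    "length": 80},
                "joined_on":   {"type": "date",    "length": 8, "encoding": "YYYYMMDD"},
            },
            "record_count": 12400,   # maintained by the DBMS, hidden from users
        },
    }
    for field, spec in data_dictionary["customer"]["fields"].items():
        print(field, spec["type"], spec["length"])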


Data flow diagrams

A data flow diagram (DFD) is a graphical representation of the "flow" of data through an information system. It differs from the system flowchart in that it shows the flow of data through processes instead of computer hardware. Data flow diagrams were invented by Larry Constantine, developer of structured design, based on Martin and Estrin's "data flow graph" model of computation.

It is common practice to draw a system context diagram first, which shows the interaction between the system and outside entities. The DFD is designed to show how a system is divided into smaller portions and to highlight the flow of data between those parts. This context-level data flow diagram is then "exploded" to show more detail of the system being modeled. Data flow diagrams (DFDs) are one of the three essential perspectives of the structured systems analysis and design method (SSADM). The sponsor of a project and the end users will need to be briefed and consulted throughout all stages of a system's evolution. With a data flow diagram, users are able to visualize how the system will operate, what the system will accomplish, and how the system will be implemented. The old system's data flow diagrams can be drawn up and compared with the new system's data flow diagrams to identify where a more efficient system can be implemented. Data flow diagrams can be used to give the end user a physical idea of where the data they input ultimately has an effect upon the structure of the whole system, from order to dispatch to recook. How any system is developed can be determined through a data flow diagram.
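As a rough sketch of the "explosion" (levelling) of a DFD, the code below assumes the common convention of numbering sub-processes 2.1, 2.2, and so on beneath process 2; the process names and the numbering shown are invented for illustration.

    # Hypothetical sketch of DFD levelling: the context-level diagram is exploded
    # into numbered processes, each of which may be exploded again in its own DFD.
    levels = {
        "0":   ["1 Take order", "2 Fill order", "3 Invoice customer"],
        "2":   ["2.1 Check stock", "2.2 Pick items", "2.3 Dispatch"],
        "2.3": ["2.3.1 Print label", "2.3.2 Book courier"],
    }

    def children(process_number: str) -> list:
        """Return the sub-processes a numbered process explodes into, if any."""
        return levels.get(process_number, [])

    print(children("2"))    # ['2.1 Check stock', '2.2 Pick items', '2.3 Dispatch']
    print(children("2.1"))  # [] - a functional primitive, described by a mini-spec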


Structure chart

A structure chart (SC) is a chart that shows the breakdown of a system to its lowest manageable levels. (IRS (2008), "Configuration Management", in: ''IRS Resources Part 2. Information Technology, Chapter 27. Configuration Management''. Accessed 14 Nov 2008.) This chart is used in structured programming to arrange the program modules in a tree structure. Each module is represented by a box which contains the name of the module. The tree structure visualizes the relationships between the modules.

Structure charts are used in structured analysis to specify the high-level design, or architecture, of a computer program. As a design tool, they aid the programmer in dividing and conquering a large software problem, that is, recursively breaking a problem down into parts that are small enough to be understood by a human brain. The process is called top-down design, or functional decomposition. Programmers use a structure chart to build a program in a manner similar to how an architect uses a blueprint to build a house. In the design stage, the chart is drawn and used as a way for the client and the various software designers to communicate. During the actual building of the program (implementation), the chart is continually referred to as the master plan. (David Wolber, ''Structure Charts: Supplementary Notes. Structure Charts and Bottom-up Implementation: Java Version''.)
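A structure chart's tree maps naturally onto a program whose top module calls subordinate modules, which in turn call their own subordinates. The sketch below uses hypothetical module names purely to illustrate the idea of top-down design.

    # Hypothetical top-down decomposition mirroring a structure chart: the top
    # module coordinates, the subordinate modules do the detailed work.
    def get_order() -> dict:
        return {"customer": "A. Smith", "items": [("widget", 2), ("gadget", 1)]}

    def price_items(items) -> float:
        return sum(3.50 * qty for _, qty in items)  # flat unit price for the sketch

    def print_invoice(customer: str, total: float) -> None:
        print(f"Invoice for {customer}: {total:.2f}")

    def process_order() -> None:             # top box of the structure chart
        order = get_order()                  # subordinate module
        total = price_items(order["items"])  # subordinate module
        print_invoice(order["customer"], total)

    process_order()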


Structured design

Structured design (SD) is concerned with the development of modules and the synthesis of these modules in a so-called "module hierarchy". In order to design an optimal module structure and interfaces, two principles are crucial:
* ''Cohesion'', which is "concerned with the grouping of functionally related processes into a particular module", and
* ''Coupling'', which relates to "the flow of information or parameters passed between modules. Optimal coupling reduces the interfaces of modules and the resulting complexity of the software".

Structured design was developed by Larry Constantine in the late 1960s, then refined and published with collaborators in the 1970s; see Larry Constantine: structured design for details. One proposed approach consists of three main objects:
* structure charts
* module specifications
* data dictionary.

The structure chart aims to show "the module hierarchy or calling sequence relationship of modules. There is a module specification for each module shown on the structure chart. The module specifications can be composed of pseudo-code or a program design language. The data dictionary is like that of structured analysis. At this stage in the software development lifecycle, after analysis and design have been performed, it is possible to automatically generate data type declarations" (Belkhouche, B., and J.E. Urban (1986), "Direct Implementation of Abstract Data Types from Abstract Specifications", in: ''IEEE Transactions on Software Engineering'', pp. 549-661, May 1986) and procedure or subroutine templates.
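As a minimal sketch of that last point, assuming a hypothetical data dictionary format, type declarations can be generated mechanically once the dictionary entries are fixed; the type mapping and entry names below are invented for illustration.

    # Hypothetical sketch: generating a record-type declaration from a data
    # dictionary entry, in the spirit of what structured design tools could do
    # once analysis and design are complete.
    TYPE_MAP = {"integer": "int", "char": "str", "date": "str"}  # crude mapping for the sketch

    def declaration_from_entry(name: str, fields: dict) -> str:
        lines = [f"class {name.capitalize()}:"]
        lines += [f"    {field}: {TYPE_MAP[kind]}" for field, kind in fields.items()]
        return "\n".join(lines)

    print(declaration_from_entry("customer", {"customer_id": "integer", "name": "char"}))
    # class Customer:
    #     customer_id: int
    #     name: str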


Criticisms

Problems with data flow diagrams have included the following:
1. Choosing bubbles appropriately
2. Partitioning bubbles in a meaningful and mutually agreed upon manner
3. The documentation size needed to understand the data flows
4. Data flow diagrams are strongly functional in nature and thus subject to frequent change
5. Though "data" flow is emphasized, "data" modeling is not, so there is little understanding of the subject matter of the system
6. Customers have difficulty following how the concept is mapped into data flows and bubbles
7. Designers must shift the DFD organization into an implementable format


See also

* Event partitioning
* Flow-based programming
* HIPO
* Jackson structured programming
* Prosa Structured Analysis Tool
* Soft systems methodology


References


Further reading

* Tom DeMarco (1978). ''Structured Analysis and System Specification''. Yourdon.
* Derek J. Hatley, Imtiaz A. Pirbhai (1988). ''Strategies for Real Time System Specification''. John Wiley and Sons Ltd.
* Stephen J. Mellor and Paul T. Ward (1986). ''Structured Development for Real-Time Systems: Implementation Modeling Techniques''. Prentice Hall.
* Edward Yourdon (1989). ''Modern Structured Analysis''. Yourdon Press Computing Series.
* Keith Edwards (1993). ''Real-Time Structured Methods, System Analysis''. Wiley.


External links


Structured Analysis Wiki


CRaG Systems, 2004.