
Information Systems Analysis and Design

Samet Girgin · Published in PursuitOfData · Jun 26, 2024

Requirement Analysis:

A requirement can simply be defined as “everything necessary and desired”. A requirement is any stakeholder’s need, expectation, constraint, or interface that must be satisfied by the proposed software product during development.

Determining software requirements calls for a systematic approach, because an incorrectly determined requirement costs the system far more than one that is never captured at all. Therefore, each requirement must satisfy certain characteristics:

  • Consistent
  • Complete
  • Feasible
  • Necessary
  • Correct
  • Traceable
  • Verifiable

Functional Requirements:

  • “What” the software should do is usually determined by the requirements of its end users or customers. These requirements are called “functional requirements”.

Non-Functional Requirements:

  • Non-functional requirements are what differentiate software products developed for the same purpose: they describe how well the system does its work (for example, performance, security, and usability) rather than what it does.

Functional vs Non-Functional Requirements:

  • Functional requirements typically relate to a specific part of a system, and the rest of the system can keep working even if that part fails or degrades. For example, if the credit card payment screen in an e-commerce system stops working, the system can still operate through its other payment channels. Non-functional requirements, on the other hand, concern the system as a whole, and a problem with them can make the entire system unusable: if an e-commerce system’s payment component cannot meet legal requirements, the system will be shut down for reliability and compliance reasons even though it technically still works.

Requirements Engineering:

Requirements engineering process
  • Requirement Discovery: The entire system is built to satisfy its requirements. The implicit purposes, rules, and policies of a wide variety of sources should also be treated as requirements, because they will shape the system in some way. Identifying these requirements is not always easy, which is why several different methods are used in the requirements discovery phase.

Requirement Analysis:

1- Evaluation: three different feasibility assessments

  • Technical feasibility
  • Economic feasibility
  • Schedule feasibility

2- Classification

3- Conflict Resolution: Some of the grouped requirements may conflict with others. Left unresolved, these conflicts can lead to an unreliable final product in the subsequent design stages, so they must be resolved with a structured method. Conflict resolution is generally carried out in four steps.

  • (1) Conflict identification, (2) Conflict analysis, (3) Conflict resolution (agreement, compromise, voting, variants, overriding), and (4) Documentation

4- Prioritization: Prioritizing the conflict-free list of requirements is especially valuable for incremental development approaches. Projects that start without prioritization risk limiting stakeholder participation, especially if the first versions do not meet expectations, and this directly reduces the performance of subsequent phases. Which methods can be used?

  • Analytical Hierarchy Process
  • Cumulative Voting (the $100 test): Each participant is given a notional $100 to divide among the listed requirements and assigns to each requirement whatever amount they think it deserves. The total priority score of a requirement is the sum of the amounts assigned to it. When the number of requirements is very large, a larger notional amount can be used. A small sketch of this scoring follows the list below.
  • Sorting
  • Grouping
  • Card Sorting Technique
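
As mentioned above, here is a minimal sketch of the $100 test in Python. The requirement names (R1–R4), participants, and allocations are made up for the example; only the scoring idea comes from the text.

from collections import Counter

# Each participant distributes exactly $100 across the requirements.
allocations = {
    "Alice": {"R1": 40, "R2": 30, "R3": 20, "R4": 10},
    "Bob":   {"R1": 10, "R2": 50, "R3": 30, "R4": 10},
    "Carol": {"R1": 25, "R2": 25, "R3": 25, "R4": 25},
}

totals = Counter()
for person, votes in allocations.items():
    assert sum(votes.values()) == 100, f"{person} must allocate exactly $100"
    totals.update(votes)  # add this participant's amounts to the running totals

# Requirements ranked by total amount assigned = priority order
for requirement, score in totals.most_common():
    print(requirement, score)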

Requirement Verification: While verifying the requirements, five different aspects are checked.

  • Validity: Does the system provide functions that best support the customer’s needs?
  • Consistency: Are there any conflicting requirements?
  • Completeness: Are all the functions the customer wants included?
  • Realism: Can the requirement be fulfilled under current budget and technology conditions?
  • Verifiability: Can the system be tested to determine whether the requirements are met?

Requirement Documentation:

  • Accurate documentation of requirements through all phases is critical for the subsequent process, data, and logic modeling phases. In this sense, the most basic requirements document is called a “requirements statement” and includes the requirement, requirement type, importance, and source.
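
To make the idea concrete, here is a minimal sketch of such a requirements-statement record in Python, using the four fields named above. The RequirementStatement class and the field values are illustrative, not a prescribed format.

from dataclasses import dataclass

@dataclass
class RequirementStatement:
    requirement: str   # what must be satisfied
    req_type: str      # "functional" or "non-functional"
    importance: str    # e.g. "must", "should", "could"
    source: str        # the stakeholder or document it was elicited from

entry = RequirementStatement(
    requirement="The system shall support credit card payments.",
    req_type="functional",
    importance="must",
    source="Sales department interview",
)
print(entry)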

Process Modelling

A model is a representation of reality, and system models are a conceptual representation of the reality required to meet the requirements.

  • Logical Models: They show “what a system is and does,” regardless of how it is implemented.
  • Physical Models: They show not only “what the system is” but also “how it will be implemented” technically.

Data Flow Diagrams: A data flow diagram presents the flow of data within the information system. It does not present the data transformation logic or detailed process descriptions. In other words, it is concerned with “what” the system does, not with “how” it does it. In this sense, data flow diagrams are considered a “black box” approach.

Data Flow Diagrams Elements:

  • Data Flow: A data flow cannot return directly to the source it started from; it must pass through at least one process. A data flow into a data store indicates an update, while a data flow out of a data store indicates information retrieval or use.
  • Process: It indicates the change or transformation of data and represents the work performed in the system.
  • Data Store: A data store should not only receive data or only send data; each designed data store is expected to both receive and send data.
  • External Entity: External entities should only be linked to processes, just as data stores are. Connections between external entities, or between an external entity and a data store, are not shown in logical diagrams. In reality, entities outside the system may have relationships with each other, but such relationships are not meaningful in a data flow diagram drawn at the logical level.

Data Flow Diagram Development:

  • Context Diagram: Context diagrams are high-level diagrams, meaning they do not go into the detailed ins and outs of the system. A context diagram contains a single process representing the entire system.
  • Parent Diagram: A parent diagram expands the context diagram and represents the main processes, data flows, and data stores of the system at a higher level of detail.
  • Child Diagram: Data flow diagrams are drawn from the general to the specific, with the context diagram at the top level. The lowest level of child diagrams is called the primitive diagram, and how far to decompose is usually left to the system analyst’s judgment.
  • Balancing in Data Flow Diagrams: Lower-level diagrams must not contradict the upper-level diagrams; that is, they must be balanced with the level above. Balancing means carrying the data flows over exactly as they appear when moving from an upper level to a lower level, as sketched below.
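
A minimal sketch of the balancing check, assuming a simplified representation in which a diagram’s boundary flows are just sets of flow names; the process and flow names are made up for the example.

# Balancing rule: the data flows entering and leaving a parent process must
# reappear unchanged at the boundary of the child diagram that explodes it.

def is_balanced(parent_inputs: set, parent_outputs: set,
                child_boundary_inputs: set, child_boundary_outputs: set) -> bool:
    """Return True if the child diagram preserves the parent's boundary flows."""
    return (parent_inputs == child_boundary_inputs and
            parent_outputs == child_boundary_outputs)

# Example: process "1.0 Handle Order" exploded into a child diagram
parent_in = {"customer order"}
parent_out = {"invoice", "rejected order"}

child_in = {"customer order"}              # flows entering the child diagram
child_out = {"invoice", "rejected order"}  # flows leaving the child diagram

print(is_balanced(parent_in, parent_out, child_in, child_out))  # True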

Logical and Physical Data Flow Diagrams:

Logical diagrams can be said to have the following advantages:

  • Better communication with users
  • More stable systems
  • Better understanding of the business by analysts
  • Flexibility and maintenance
  • Elimination of redundancies and easier creation of the physical model

The use of physical diagrams brings the following advantages.

  • Identifying which processes are carried out by humans and which processes are automated
  • Defining the processes in more detail
  • Ordering the processes as they should be in a special order
  • Identifying temporary data stores
  • Identifying the names of specific and current files and outputs
  • Adding controls to ensure processes are working properly.

Data and Logic Modelling:

Data modeling is the process of analyzing data objects and their relationships to other objects. It is used to determine the data the business requires. Data models are created for the data to be stored in a database. The most basic form of data model is the data dictionary: a collection of names, definitions, and attributes associated with data elements used in a database, information system, or part of a research project (IBM).

A data dictionary is created in five stages.

1- The first stage is to understand each flow in the data flow diagrams.

2- Data flows need to be converted into data structures.

3- Data structures are converted into structured records. At this stage, special records defining each data structure will be determined.

4- In the fourth stage, the basic dictionary is prepared by breaking each structural record down into its data elements.

5- Data stores are designed for data records.
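
A minimal sketch of what a single data-dictionary entry might hold, following the IBM definition quoted above (names, definitions, and attributes); the customer_number element and its attributes are illustrative, not taken from the article.

customer_number = {
    "name": "customer_number",
    "definition": "Unique identifier assigned to a customer at registration",
    "attributes": {
        "type": "integer",
        "length": 8,
        "allowed_values": "10000000-99999999",
        "appears_in": ["Customer record", "Order record"],  # data structures that use it
    },
}

print(customer_number["name"], "->", customer_number["definition"])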

Logic Modelling: In the logic modeling phase, three basic methods are preferred: structural language, decision tables, and decision trees. This modeling step describes what happens inside each transformation process.

1- Structural Language: A verbal approach, adapted from ordinary written language, used to express decision logic in a clearer and less ambiguous way.

2- Decision Table: A matrix representation of the logic behind a decision. It describes the possible conditions and the alternative actions to be taken under those conditions, using the three building blocks below; a small sketch follows the list.

  • Conditions
  • Actions
  • Rules
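
Here is the sketch mentioned above: a minimal decision table in Python for a hypothetical order-discount policy. The two conditions, the actions, and the rules are all made up for the example.

# Each rule maps a combination of condition values to an action.
RULES = [
    # (is_member, order_over_100) -> action
    ((True,  True),  "apply 15% discount"),
    ((True,  False), "apply 5% discount"),
    ((False, True),  "apply 10% discount"),
    ((False, False), "no discount"),
]

def decide(is_member: bool, order_over_100: bool) -> str:
    # Evaluate the conditions against each rule and return the matching action.
    for conditions, action in RULES:
        if conditions == (is_member, order_over_100):
            return action
    raise ValueError("No rule covers this condition combination")

print(decide(is_member=True, order_over_100=False))  # apply 5% discount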

3- Decision Trees: In systems analysis, decision trees are mainly used to identify and organize the conditions and actions of a fully structured decision process. A decision tree is a graphical representation of a decision situation; when a decision condition has more than four alternatives, decision trees are usually preferred.

Structural Language - Decision Table - Decision Trees

Architectural Design:

Architectural design, in its most basic definition, is the transformation of logically prepared designs into physical ones.

Critical Success Factors in Architectural Design:

  • Corporate organization and culture
  • Existing enterprise resource planning software
  • Startup Costs
  • Scalability
  • Web integration
  • Requirements for the previous system
  • Data processing
  • Security

Architectural Design Requirements:

  • Operational Requirements: Technical environment, system integration, portability, maintainability
  • Performance Requirements: Speed, capacity, security
  • Security Requirements: System value estimates, access control, encryption and authentication, virus control
  • Cultural and Political Requirements: Multilingualism, customization, making unstated norms explicit, legal requirements

Architectural Design: A server is a computer that provides services to one or more computers called clients.

Evolutionary development of software architectures
  • Client/Server Architecture: In a typical client/server architecture, the client handles the entire user interface, including data entry, data querying, and screen presentation. The server stores data in databases while also providing data access and database management functions.
  • Layered Architecture: The system is organized into layers, with specific functions assigned to different levels.
  • Data-centric Architecture: In this architecture, there is a central data store surrounded by client software. The defining feature is that the only way for components to interact with one another is through the central data store.
  • Data Flow Architecture: In this approach, also called pipe-and-filter architecture, independent functions connected in series take the previous function’s output as their input, apply a transformation, and pass the result on to the next function (sketched below).
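
A minimal sketch of the pipe-and-filter idea in Python: each filter takes the previous filter’s output as its input. The specific filters (splitting lines, stripping blanks, uppercasing) are arbitrary examples, not part of the article.

def read_input(text: str) -> list[str]:
    return text.splitlines()

def strip_blanks(lines: list[str]) -> list[str]:
    return [line.strip() for line in lines if line.strip()]

def to_upper(lines: list[str]) -> list[str]:
    return [line.upper() for line in lines]

def pipeline(data, *filters):
    for f in filters:  # the "pipes" simply pass each output to the next filter
        data = f(data)
    return data

print(pipeline("hello\n\nworld\n", read_input, strip_blanks, to_upper))
# ['HELLO', 'WORLD']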

Network Architecture:

  • A network is the infrastructure that allows hardware, software, and data resources to be shared so that users can improve their capabilities at low cost.
  • Developed by the ISO (International Organization for Standardization), the OSI (Open Systems Interconnection) model describes how data moves from an application on one computer to an application on another networked computer.
  • Network Topologies: The configuration of the connections and nodes of a network in a way that they relate to each other is called network topology. Topologies are categorized as physical network topology, which covers the signal transmission media infrastructure, or logical network topology, which describes the way data travels between devices over the network.
Hierarchical (tree) topology — Bus topology (when the main backbone fails, the entire network stops working) — Ring topology — Star topology — Mesh topology

Data Processing Methods:

  • An online processing system processes each transaction as and where it occurs and produces output for users immediately.
  • Batch processing means collecting data and processing it in batches at regular intervals. This was the basic approach of the first information systems: the earliest management information systems were batch systems that returned reports to company managers at predetermined times. Although batch processing is still advantageous in some areas, companies are increasingly turning to online systems. A small sketch contrasting the two approaches follows this list.
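
As mentioned above, here is a minimal sketch contrasting the two approaches, using made-up transaction records: online processing handles each transaction as it arrives, while batch processing accumulates transactions and processes them together.

transactions = [{"id": i, "amount": 10 * i} for i in range(1, 7)]

# Online: process immediately, one transaction at a time
def process_online(txn):
    print(f"processed txn {txn['id']} for {txn['amount']}")

for txn in transactions:
    process_online(txn)

# Batch: collect transactions, then process the whole batch at the scheduled time
BATCH_SIZE = 3
batch = []
for txn in transactions:
    batch.append(txn)
    if len(batch) == BATCH_SIZE:
        total = sum(t["amount"] for t in batch)
        print(f"processed batch of {len(batch)} transactions, total {total}")
        batch.clear()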

The Impact of the Internet:

  • Internet-based systems
  • Virtualization
  • Cloud computing
  • Cloud computing service models: IaaS, PaaS, SaaS, DaaS (Data as a Service), AaaS (Analytics as a service)
  • E-commerce

Database Design:

There are some special fields called keys. These fields are especially valuable for relational databases; the most common kinds are listed below, and a small SQL sketch follows the list.

  • Primary Key: A unique identifier for each record in an entity. It ensures that each record can be uniquely identified. For example, CustomerID could be the primary key for a Customer entity.
  • Foreign Key: An attribute in one entity that links to the primary key of another entity, establishing a relationship between the two entities. For instance, OrderID in an OrderDetails table might be a foreign key referencing the Orders table.
  • Super Key: Any set of attributes that uniquely identifies the rows of a table; it may include more attributes than are strictly needed for uniqueness.
  • Candidate Key: Attributes that uniquely identify the rows of a table. The primary key is selected from one of the candidate keys.
  • Alternative Key: All candidate keys that are not selected as primary keys.
  • Composite key: It is also known as a compound key, which is a combination of two or more columns in a table that together uniquely identify a row in that table.
  • Unique Key: It is a key that contains unique values. It is different from the primary key in that it allows a single null value.
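
Here is the sketch mentioned above: the same key concepts expressed as SQL table definitions, run through Python’s built-in sqlite3 module. The Customer/Orders/OrderDetails schema is only illustrative, extending the examples named in the list.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Customer (
    CustomerID INTEGER PRIMARY KEY,          -- primary key: unique identifier per row
    Email      TEXT UNIQUE,                  -- unique key: also unique, but not the primary key
    Name       TEXT NOT NULL
);

CREATE TABLE Orders (
    OrderID    INTEGER PRIMARY KEY,
    CustomerID INTEGER NOT NULL,
    OrderDate  TEXT NOT NULL,
    FOREIGN KEY (CustomerID) REFERENCES Customer(CustomerID)  -- foreign key: links to Customer
);

CREATE TABLE OrderDetails (
    OrderID   INTEGER,
    ProductID INTEGER,
    Quantity  INTEGER NOT NULL,
    PRIMARY KEY (OrderID, ProductID),        -- composite key: two columns identify a row together
    FOREIGN KEY (OrderID) REFERENCES Orders(OrderID)
);
""")
conn.close()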

Database Management System: They are software that allows operations such as creating databases and tables, entering data, searching data, obtaining information (or new data) from data, updating and deleting data, authorization, security, and maintenance.

Database Models:

  • Databases are driven by different data models. Although some of these data models have lost popularity over the years, they are important for understanding overall database evolution.

1- Hierarchical Model: Data is organized into a tree-like structure, where each record has a single parent and can have multiple children.

  • Example: A file system, where directories contain files and other directories.

2- Network Model: Similar to the hierarchical model, records can have multiple parent and child records, forming a graph-like structure.

  • Example: A social network where users can be connected to multiple other users.

3- Relational Model: Data is organized into tables (relations) consisting of rows (tuples) and columns (attributes). Relationships between tables are defined through foreign keys.

  • Example: A customer database where customer information, orders, and products are stored in separate tables.

4- Object-Oriented Model: Data is stored as objects, similar to object-oriented programming. Each object contains both data and methods that operate on the data.

  • Example: A multimedia database where images, videos, and documents are stored as objects.

5- Entity-Relationship Model (ER Model): Uses entities (objects) and relationships to represent data. Entities have attributes, and relationships define how entities interact.

  • Example: An ER diagram depicting customers, orders, and products in an e-commerce system.
  • ERD (Entity-Relationship Diagram): Entity-Attribute-Relationship
Entity relationship diagram elements for a conceptual model

Normalization: Normalization in database design is the process of organizing data to reduce redundancy and improve data integrity. It involves dividing a database into two or more tables and defining relationships between the tables. The goal of normalization is to minimize duplicate data and ensure data dependencies make sense, leading to a more efficient and logical database design.

  • 1NF: 1. There must be no repeating groups of columns in the same table. 2. Each column must hold only a single value. 3. Each row must be identified by a unique key.
  • 2NF: 1. The table must already be in 1NF. 2. There must be no partial dependency between non-key columns and the primary key; a partial dependency occurs when a non-key column depends on only part of a composite primary key. 3. No subset of data should be repeated across multiple rows; new tables must be created for such data subsets. 4. Relationships between the main table and the new tables should be defined using foreign keys.
  • 3NF: 1. The table must already be in 2NF. 2. No non-key column should depend on another non-key column; in other words, every non-key column must depend only on the key.
1NF-2NF-3NF Examples
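
A minimal sketch of the idea behind normalization, with made-up order data: the unnormalized record has repeating product columns (a 1NF violation) and stores the customer name with every order (it depends on CustomerID rather than on the order key), so the data is split into separate tables linked by keys.

# Unnormalized: repeating groups (Product1, Product2) and redundant customer data
unnormalized = [
    {"OrderID": 1, "CustomerID": 10, "CustomerName": "Ada", "Product1": "pen", "Product2": "ink"},
]

# Normalized: one fact per table, linked by keys
customers = {10: {"CustomerName": "Ada"}}   # key: CustomerID
orders = {1: {"CustomerID": 10}}            # key: OrderID
order_lines = [                             # key: (OrderID, product)
    {"OrderID": 1, "Product": "pen"},
    {"OrderID": 1, "Product": "ink"},
]

# Reconstructing the original view via the keys (a join done by hand)
for line in order_lines:
    order = orders[line["OrderID"]]
    customer = customers[order["CustomerID"]]
    print(line["OrderID"], customer["CustomerName"], line["Product"])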

Database Design Process: Database designs are carried out sequentially at three different levels.

  • Conceptual Design: It captures the entities and the relationships between them. This design is a visual representation of the verbally expressed design requirements and is produced so that they can be clearly and thoroughly understood by the analyst.
  • Logical Design
  • Physical Design
Conceptual-Logical-Physical Design

Human-Computer Interaction:

The basic concept that defines people’s interaction with a product or service is user experience. If the final product here is software, this concept is often considered within the scope of human-computer interaction (HCI).

An interface is the meeting point between the end user and the information system; it manages input, output, and interaction. The process of designing a software interface for a good user experience is considered in five stages:

  • Strategy: User Requirements and Work Purposes
  • Scope: Functional properties, content requirements
  • Structure: Input, Output, interaction design
  • Skeleton: Interface Design
  • Surface: Visual Design

Input, Output and Interaction Design:

  • Input Design: The following qualities should be met when creating input designs: efficiency, effectiveness, ease of use, stability, simplicity, and attractiveness. The standard input checks are completeness, format, range (interval), consistency, and database checks; a small validation sketch follows this list.
  • Output Design: Printed materials are one of the basic types of output, the most preferred being business reports. Although business reports have gradually lost their popularity over the years due to the development of electronic business and environmental concerns, many businesses still prefer this type of printed material.
Infographic Design Types — Different output media comparisons
  • Interaction Design: Interaction can be described as the communication, or dialogue, between a system user and the information system. If this dialogue is not established correctly, the system will not succeed, no matter how capable it is otherwise.
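
Here is the validation sketch mentioned under input design: a minimal example of completeness, format, range, and consistency checks in Python. The form fields, limits, and rules are hypothetical.

from datetime import date

def validate_order(form: dict) -> list[str]:
    errors = []

    # Completeness check: every required field is present
    for field in ("quantity", "order_date", "ship_date", "email"):
        if field not in form or form[field] in ("", None):
            errors.append(f"missing field: {field}")

    # Format check: e-mail has the expected shape
    if "email" in form and "@" not in str(form.get("email", "")):
        errors.append("email format is invalid")

    # Range (interval) check: quantity within allowed limits
    if not (1 <= form.get("quantity", 0) <= 1000):
        errors.append("quantity must be between 1 and 1000")

    # Consistency check: related fields agree with each other
    if form.get("ship_date") and form.get("order_date"):
        if form["ship_date"] < form["order_date"]:
            errors.append("ship date cannot precede order date")

    return errors

print(validate_order({"quantity": 5, "order_date": date(2024, 6, 1),
                      "ship_date": date(2024, 6, 3), "email": "a@b.com"}))  # []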

User Interface:

  • Graphic User Interface — GUI
  • Website design: 7 basic points need to be considered when designing a website. 1-Structure 2- Content 3- Text 4- Graphics 5- Presentation types 6- Navigation 7- Promotion
  • Mobile Site Design

Usability and accessibility:

  • The extent to which a system is usable by its users is a key performance indicator for that system. No matter how functional a site is behind the scenes, if users cannot easily reach and use those functions, the site’s failure is inevitable. This leads to customer loss if the end user is a customer, or to low productivity if the end user is an in-house employee.
  • Interface design principles: The first step in a design process is to understand the work that is the subject of the design. Afterwards, a layout plan suited to the interface structure of the system should be created. These layouts should maximize users’ awareness of the content (for example, the three-click rule).

Implementation and Sustainability:

Coding is the translation of program logic into specific instructions that a computer can execute. Good code is expected to be:

  • Readable
  • Portable
  • General
  • Simple and plain
  • Reusable

System Security: System security has many components, but the three most basic are confidentiality, integrity, and availability (CIA). Confidentiality protects the system from unauthorized access and prevents its contents from being disclosed. Integrity prevents unauthorized users from creating, editing, or deleting information. Availability ensures that authorized users can access the content they are authorized to see at any time, without obstacles.

  • System security levels: Physical, Network, Application, Document, Procedures, Users (No Logical)
  • System Tests: Unit testing is the testing of a single program or module that performs a specific function. When the output of one program or module forms the input of another, the integration between the two must work flawlessly; integration tests are performed to check this. System testing can be thought of as integration testing of the entire software. An acceptance test is performed to determine whether the requirements of a specification or contract have been met.
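
A minimal sketch of a unit test using Python’s built-in unittest module; the compute_total function under test is a made-up example, not something from the article.

import unittest

def compute_total(prices, tax_rate=0.18):
    """Sum the item prices and add tax."""
    subtotal = sum(prices)
    return round(subtotal * (1 + tax_rate), 2)

class ComputeTotalTest(unittest.TestCase):
    def test_adds_tax_to_subtotal(self):
        self.assertEqual(compute_total([100, 50], tax_rate=0.10), 165.0)

    def test_empty_order_costs_nothing(self):
        self.assertEqual(compute_total([]), 0.0)

if __name__ == "__main__":
    unittest.main()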

Installation: In direct installation, once the new system is installed, the existing system (if any) is terminated and the new system is activated immediately. In parallel installation, the new system runs alongside the existing system for a period before the old one is retired. In phased installation, the system is brought into use step by step, on a modular basis.

Sustainability:

  • Documentation: System and user documentation
  • Education and support
  • Performance management: Error management, workload management, capacity management
  • Maintenance: Corrective, adaptive, perfective, and preventive maintenance
  • Backup and Recovery:
