An activity is a set of actions performed by agents. An activity can be atomic or grouped. This definition is compatible with a BPMN activity, with the exception that in BPMN the work must be performed within a business process (“An Activity is work that is performed within a Business Process. An Activity can be atomic or non-atomic (compound). [...]”).
The definition of LRM activities also involves triggering events (time instants) that start and stop an activity, and offers mechanisms to relate activities to the events that triggered them, the resources they used and transformed, and the other activities they started, stopped or suspended/resumed. Thanks to these temporally defined events, the time information related to an activity can be expressed in terms of its beginning and ending times, as well as its duration.
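The relation between triggering events and activity duration can be sketched minimally in Python (the class and field names here are illustrative, not LRM vocabulary):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    """A triggering event, modelled as a time instant (seconds)."""
    timestamp: float

@dataclass
class Activity:
    """An activity bounded by the events that started and stopped it."""
    name: str
    started_by: Event
    stopped_by: Event

    @property
    def duration(self) -> float:
        """Duration derived from the beginning and ending events."""
        return self.stopped_by.timestamp - self.started_by.timestamp

ingest = Activity("ingest", Event(100.0), Event(160.0))
print(ingest.duration)  # 60.0
```

The point of the sketch is only that begin/end times and duration fall out of the events themselves rather than being stored on the activity.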
Anything (person, organisation, hardware, software) that incurs changes or plays an important role in an ecosystem. An agent can cause a change to an ecosystem by performing activities.
In LRM, an agent is considered the “bearer of change”, and therefore establishes the conceptual link to the changes observed or performed on the resources composing the digital ecosystem.
An anomaly is a deviation of the value of a parameter from the nominal or expected values in the telemetry of a space instrument. Anomalies might be within limits, in which case they require no action from the operators, or above limits, prompting corrective action. Anomaly detection software covers both cases and can be used to follow instrumental degradation and thus, in case of accelerated degradation, alert operators to possible future failures.
Appraisal determines which data should be kept, typically based on criteria that are contained within collection policies. The criteria for accepting or rejecting content are difficult to describe rigorously, which makes appraisal processes difficult to reproduce and automate. PERICLES aims to investigate ways of automating or guiding the appraisal process. Appraisal can be divided into technical appraisal and content-based (intellectual) appraisal:
“The degree to which a person (or system) regards an object as what it is purported to be. Authenticity is judged on the basis of evidence.” (OAIS definition, OAIS model CCSDS 650.0-M-2 p. 1-9).
Business Process Management (BPM) is an approach for consistently improving processes and business activities of an organisation.
A Business Process Management System (BPMS) is software for executing business processes (see Process). It is similar to a Workflow Management System, but a BPMS can provide additional components such as a workflow system, dashboards, reporting and other tools.
Business Process Model and Notation (BPMN) is a graphical representation for specifying business processes in a business process model. It is similar to flowcharts and defines a set of elements that can be used. The latest BPMN specification, 2.0, is detailed enough either to create models that simply visualise a process flow or to create a process which can be executed on a process engine. PERICLES uses BPMN to describe individual process models.
Change is a generic term (“an act or process through which something becomes different” (http://www.oxforddictionaries.com/definition/english/change)), but for the Digital Ecosystem we consider these categories of change:
Change management describes the processes and methodologies involved in controlling and tracking the evolution of digital ecosystem components. One objective is to offer a reasonable level of guarantees regarding the consistency of the preservation system (so that important functionalities remain operational), but also to handle provenance-related knowledge and to make the system's evolution understandable (through history recording).
Change propagation, depending on the chosen level, consists of analysing or enacting the effects of a change, that is, determining whether changes to one entity in an ecosystem may result in cascading changes to other dependent entities. The basis for this kind of operation is a dependency graph, and each step of the propagation requires analysing/validating the preconditions and analysing/performing the transformative actions (evaluating the impact), as described by the LRM dependency model.
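A minimal sketch of this kind of propagation over a dependency graph, assuming a toy graph (the entity names and the callable-precondition representation are invented for illustration; the LRM dependency model is richer):

```python
from collections import deque

# Toy dependency graph: dependee -> list of (dependant, precondition).
# A precondition is a callable deciding whether propagation continues.
graph = {
    "file-format": [("viewer", lambda: True)],
    "viewer":      [("workflow", lambda: True)],
    "workflow":    [],
}

def propagate(changed, graph):
    """Breadth-first analysis of cascading change impact."""
    impacted, queue = set(), deque([changed])
    while queue:
        entity = queue.popleft()
        for dependant, precondition in graph.get(entity, []):
            if precondition() and dependant not in impacted:
                impacted.add(dependant)   # an impact plan would run here
                queue.append(dependant)
    return impacted

print(sorted(propagate("file-format", graph)))  # ['viewer', 'workflow']
```

At the "enacting" level, the comment line is where each dependency's transformative actions would be executed rather than merely recorded.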
Change may also occur in the larger social or cultural context common to the institution and the user communities. Examples are laws, disciplines or cultural norms which require changes for the institutions and users.
In the context of PERICLES, a community of practice group is a group of selected individuals from various professional roles, expert practitioners representative of a given industrial, cultural or academic field, who are periodically gathered together to discuss a fundamental question or set of issues relevant to the preservation of digital objects.
Context of a digital object is anything external to the object itself that can affect its interpretation. Examples include technical context (the technical environment in which the digital object can or should be created, used or modified); semantic context (the meaning of the terms, variables and entities used by the digital object); domain or intellectual context (the necessary background information to use or understand a digital object).
References: SHAMAN D3.1 Preservation Context Model
Context of use (or use-context) is a specific type of context referring to information external to the DO that is relevant to the DO's use. Thus, a DO can have various different use-contexts, depending on the associated activities in which the item plays a part. (References: PERICLES D4.3 and D4.4.)
An activity that tries to maintain and enhance the value of the data. It is similar to long-term preservation, but intellectual modification and enrichment are not precluded, because data curation has different aims. Measures may include, for example, correcting errors, versioning, updating terminology, or adding measurement results to a time series.
The reversal of an information encapsulation process. The extent to which the original information units can be restored correctly depends on the characteristics of the information encapsulation technique used. It is not always possible to restore all original information units.
Metadata which is necessary for decapsulation is called restoration metadata.
Given two objects A and B, A is dependent on B if changes to B have a significant impact on the state of A, or if changes to B can impact the ability to perform function X on A. LRM defines dependencies as a directed graph between potentially several dependees and several dependants. A dependency is also characterised by an intention, logically described by a specification; it may also be associated with a precondition that expresses the contextual prerequisites for the dependency to be activated, and an impact plan that expresses the transformative actions that must be undertaken when it has been activated. LRM also makes a distinction between conjunctive dependencies (if one dependee changes, the dependency must be activated, provided its preconditions are satisfied) and disjunctive dependencies (if all dependees change, the dependency must be activated, again only if its preconditions are satisfied).
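The conjunctive/disjunctive activation rule can be expressed directly (a sketch only; the string-valued kinds and the list of changed-flags are illustrative, not the LRM representation):

```python
def should_activate(kind, dependees_changed, precondition=True):
    """Decide whether a dependency is activated.

    conjunctive: activated if ANY dependee changed (the dependant
                 relies on all dependees together).
    disjunctive: activated only if ALL dependees changed (any single
                 unchanged dependee still satisfies the dependant).
    A dependency is never activated if its precondition fails.
    """
    if not precondition:
        return False
    if kind == "conjunctive":
        return any(dependees_changed)
    if kind == "disjunctive":
        return all(dependees_changed)
    raise ValueError(f"unknown dependency kind: {kind}")

print(should_activate("conjunctive", [True, False]))  # True
print(should_activate("disjunctive", [True, False]))  # False
```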
There are many different definitions and approaches to define, describe and model digital objects, for example:
The definitions above are generally compatible. It would be expected that any item available digitally (2) is composed of a set of bit sequences (3). It is conceivable that a digital object could be defined in terms of its content (with this definition possibly associated with a unique identifier) and the bit sequences could be created or recreated on demand, but this would be very unusual. The definition (1) imposes the additional requirement that the object have a unique identifier, but since it is possible to have objects that satisfy (2) and (3) that do not have this property, we do not consider this to be a defining feature.
We therefore define a digital object following (2) above as “any item that is available digitally”.
There are several aspects to what a digital object is, and a useful categorisation is provided by Thibodeau (http://www.clir.org/pubs/reports/pub107/thibodeau.html):
A digital object has at least three aspects:
In PERICLES, "digital ecosystem" should be interpreted as a concept whose purpose is to analyse and model the ability of an infrastructure to maintain the usefulness of digital objects. We define a digital ecosystem in the following way: a digital ecosystem consists of the set of entities and dependencies influencing, or necessary for, successful use at a later point in time.
Here, the ecosystem is the environment that is consulted for further analyses. It is similar in meaning to the term Device Under Test (DUT) from electronics testing: a DUT can be a single component, an assembly group or a whole device. An ecosystem is equivalent to this; depending on the desired granularity, it can be a single software component, a process, a software system, or a whole institution with all its software components, human processes and organisation.
The ecosystem can be modelled with the Digital Ecosystem Model.
It is a formal model by the PERICLES project to describe a Digital Ecosystem (see Digital Ecosystem). The model is not a blueprint for what a technical infrastructure or even a dedicated preservation system should look like. Instead, it is a snapshot of the current infrastructure and situation, irrespective of whether its aim is to preserve anything at all and irrespective of whether it is effective or dysfunctional. The purpose of the digital ecosystem model is to provide a way to manage such ecosystems.
In PERICLES a "Digital Ecosystem Model" is interpreted as consisting of different types of entities with different dependency relations between them:
The Digital Ecosystem model provides all components and concepts to model a Digital Ecosystem. A concrete model of a real ecosystem is called Digital Ecosystem Model Instance.
See Digital Object.
Domain ontologies are specific ontologies that are used for modelling the respective domains (i.e. case studies). They are not necessarily aimed at exhaustively modelling the respective domains, but are used for modelling the Digital Preservation related interdependencies between ecosystem entities, along with the impacts of change. The other ontologies (LRM, DEM) provide a set of more generic concepts and may be extended by a domain ontology.
It is a tool for creating Digital Ecosystem Model instances. It provides a GUI and a Java API that can be used by other software components. EcoBuilder creates scenario-based instantiations of the Digital Ecosystem Model (DEMI). EcoBuilder can apply entity templates for the well-defined instantiation of the model and its entities that take care of LRM modelling conventions, so that a user who creates an ecosystem model instance can use this higher level of abstraction.
See Digital Ecosystem.
An entity denotes anything that exists, whether in a material or immaterial form. The word entity comes from the Latin word ēns and means being. It is an umbrella term often used in the IT world, and depending on the application area it may have a special meaning. For example, an entity in database theory is a piece of information that can exist independently and can be uniquely identified (like a person, an address etc.).
We understand entity as a general term for a component inside a model.
ERMR is a software component that supports the registration and storage of entities. The purpose of registration is to store salient information about entities, i.e. metadata, semantics, authentication information, dependencies, notification information, pre-defined queries that run on the entity store, and other things. The registry is thus the combination of the stored information with the rules and conventions (the schema) that govern it. The model repository is the place where the models are stored.
The entity store offers an abstracted, unified interface on top of the underlying storage technology, such as iRODS, Cassandra, S3 or a triple store, i.e. the abstraction is constant and access does not depend on the underlying technology. The entity store is a place that can store anything which is available digitally. To retrieve items it is necessary to identify them. The entity store does not know what type of information is being stored; this means that it cannot infer any semantics on the data, because it does not interpret the content.
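The idea of a storage-agnostic interface can be sketched as follows (the method names and the in-memory backend are assumptions made for illustration, not the ERMR API):

```python
from abc import ABC, abstractmethod

class EntityStore(ABC):
    """Unified interface; backends (e.g. iRODS, Cassandra, S3)
    would subclass this without client code changing."""

    @abstractmethod
    def put(self, identifier: str, blob: bytes) -> None: ...

    @abstractmethod
    def get(self, identifier: str) -> bytes: ...

class InMemoryStore(EntityStore):
    """Toy backend: the store never interprets the stored bytes."""
    def __init__(self):
        self._items = {}
    def put(self, identifier, blob):
        self._items[identifier] = blob
    def get(self, identifier):
        return self._items[identifier]

store = InMemoryStore()
store.put("urn:demo:1", b"opaque payload")
print(store.get("urn:demo:1"))  # b'opaque payload'
```

Note that both operations work purely on identifiers and opaque bytes, which is why no semantics can be inferred at this layer.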
See Significant Environment Information (SEI).
In the digital realm it is not pre-defined what "identity" means, and it has to be defined for a context. It is closely related to how we define a digital object. What is important for us is to determine when the identity changes (i.e. when a change implies a new or modified resource).
It is possible to assume multiple identity definitions, each providing its features and advantages. Examples are:
References: Digoiduna project: technical features https://sites.google.com/site/digoiduna/Ontology/features, digital object identifier systems http://www.digoiduna.eu/documentation/pi_architectures
The fusion of two or more information units, whereby one unit serves as carrier and the other(s) as payload. During the embedding process the payload is embedded in the carrying information with a resulting fused information unit. Information Embedding includes techniques of Steganography, Digital Watermarking, Information Frames, and the use of file format features for embedding.
The aggregation of related information units, so that they can be handled as a single information unit. Information Encapsulation combines the terms Packaging and Information Embedding. Decapsulation means the reversal of the encapsulation process.
Information frames consist of the same medium as the carrier information and are attached to it without modifying it. A frame can consist of additional pixels for images, an additional soundtrack for audio files, or an additional sequence in a film. A popular example is the closing credits of a film.
The grouping of different activities and events dealing with an entity into potentially repeating phases, usually ranging from conception, creation and usage to deletion or modification (thereby becoming the basis for another entity). The notion is mostly used for digital objects, but policies and services are examples of entities where a lifecycle also makes sense.
References: DCC Curation Lifecycle, SHAMAN policy lifecycle
Long-term preservation can be divided into three categories:
A meta model is the model about the model. A meta model defines which types of entities can be used inside a model and how they relate to each other. There can be many inherited layers that finally form a model.
A model allows one to capture, visualise and analyse a scenario. It is an abstract representation of (some aspects of) the real world.
Uses of models (IT perspective):
Models can have different granularities: some are very detailed, others more abstract. A model has a certain structure, elements, and rules for how the elements can be used. One might call this the meta-model, the model of the model (see also Meta-model). To make use of this meta-model it is necessary to define the elements and relations depending on what should be captured in the model.
Besides the general capability of expressing the granularity of a model, it is possible to build hierarchies of models. The models can be extended, whereby each layer becomes more concrete or specialised.
MICE is a client application of the ERMR (qv) that visualises the impact of change on the ecosystem’s entities. The aim of this tool is to assist the persons that manage an ecosystem in evaluating and understanding how a potential change to an element will impact the overall ecosystem.
See Process Compiler.
We draw a distinction between a model schema (which mainly defines class hierarchies in order to help describing the knowledge of a particular domain) and a model instance (which defines individuals belonging to those classes, i.e. a particular description of the targeted domain, compliant with the vocabulary defined by the schema). See Linked Resource Model Instance and Digital Ecosystem Model Instance for examples of model instances.
A place where models are kept. See ERMR.
A digital preservation system is always made up of components: objects in the archives, users, usage scenarios, etc. The question is whether these components add up to more than just their sum. Ncpol2sdpa is a software tool for the macroscopic scanning of datasets to detect non-classical correlations. Such correlations belong to a class called entanglement in quantum mechanics, and indicate quantum-like behaviour in the data, e.g. in co-citations. Ncpol2sdpa uses the method of sparse semidefinite programming relaxations for polynomial optimisation problems of noncommuting variables to isolate such correlations, exclude hidden variable theories in networked data, and verify the strength of observed correlations. It is written in Python and, while it is not an accelerated tool in itself, it acts as a necessary preparatory step to solve extreme-scale semidefinite programs on accelerators.
“In the context of computer and information sciences, an ontology defines a set of representational primitives with which to model a domain of knowledge or discourse. The representational primitives are typically classes (or sets), attributes (or properties), and relationships (or relations among class members). The definitions of the representational primitives include information about their meaning and constraints on their logically consistent application.”
Ontologies are widely used in PERICLES at different levels. There are ontologies that provide a set of high-level concepts, like the LRM or DEM. A concrete scenario that uses the concepts is an instance (see LRMI or DEMI).
For ontologies that are specific to a domain or representation, we use the term “Domain Ontology” to mark the specific ontologies that contain concrete information and data for our use cases.
The aggregation of two or more information units by packing them in an information container.
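A minimal illustration of packaging, using a ZIP archive as the information container (the choice of container format is an assumption for the example; packaging can use other container formats):

```python
import io
import zipfile

def package(units: dict) -> bytes:
    """Pack several information units into one container (here: a ZIP)."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as container:
        for name, payload in units.items():
            container.writestr(name, payload)
    return buf.getvalue()

def unpack(container_bytes: bytes) -> dict:
    """Decapsulation: reverse the packaging process."""
    with zipfile.ZipFile(io.BytesIO(container_bytes)) as container:
        return {name: container.read(name) for name in container.namelist()}

units = {"object.txt": b"content", "metadata.json": b"{}"}
assert unpack(package(units)) == units  # packaging here is lossless
```

Unlike information embedding, nothing is fused into a carrier: the units sit side by side in the container and can be restored exactly.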
The PERICLES Content Aggregation Tool is a framework for information encapsulation techniques. It provides a questionnaire for capturing the user scenario, and a scenario-based decision mechanism which scores information encapsulation techniques and outputs the highest-scoring ones. The techniques can be used from within the framework.
Library containing algorithms for detecting high-level visual concepts in images.
The PERICLES Extraction Tool (PET) is a software tool to extract information from the environment where digital objects are created and used. It is a generic, modular framework that can be adapted to support different use cases and domains through specific modules and configuration profiles. In a nutshell, PET works by analysing the use of the data from within the creator or consumer environment, extracting information useful for the later reuse of the data that would not be possible to derive at a later point.
PET is open source software (Apache licensed) written in Java and available on GitHub at https://github.com/pericles-project/pet.
A persistent identifier is an artificial property that allows an object to be persistently and uniquely identified.
PET2LRM is a tool that converts the output of the PERICLES Extraction Tool (PET) into LRM-compatible instances.
PROPheT (PERICLES Ontology Population Tool) is a novel GUI-equipped instance extraction engine, for locating instances (realisations) of concepts and relations in a Linked Data source (e.g. DBPedia), filtering them and subsequently inserting them into a domain ontology. PROPheT offers three types of instance extraction-related functionalities (class-based populating, instance-based populating, instance enrichment) along with user-driven mapping of data properties and is flexible enough to work with any domain ontology (written in OWL) and any RDF Linked Data set that is available via a SPARQL endpoint.
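For illustration, a class-based population step might issue a SPARQL query of roughly this shape against an endpoint such as DBpedia (this query is a hand-written sketch, not PROPheT's actual query template):

```python
def class_population_query(class_uri: str, limit: int = 10) -> str:
    """Build a SPARQL query retrieving labelled instances of a class,
    as a class-based population step might send to a SPARQL endpoint."""
    return (
        "PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>\n"
        "SELECT ?instance ?label WHERE {\n"
        f"  ?instance a <{class_uri}> .\n"
        "  ?instance rdfs:label ?label .\n"
        '  FILTER (lang(?label) = "en")\n'
        f"}} LIMIT {limit}"
    )

q = class_population_query("http://dbpedia.org/ontology/Artwork", limit=5)
print(q)
```

The retrieved instances would then be mapped onto the target domain ontology's classes and properties before insertion.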
A policy is a guideline or a goal that defines the desired state inside an ecosystem, expressed with constraints. A policy describes the 'what' (guidelines) and not the 'how' (implementation).
A policy may describe things that need to be done (it could dictate that a specific process is used) but would not normally go into detail about this process. It cannot be assumed that a policy gives any information about how it should be applied, enacted or enforced.
Draft definitions of actionable policies for preservation management have been published by the RDA Practical Policy Working Group:
QA of Policies: This is a special form of quality assurance that has the aim to validate the correct application of policies in the ecosystem, through a set of QA criteria that apply to the ecosystem entities.
The Policy Editor is a tool for editing low-level “concrete” policies. The policies are predefined as templates that can be customised by using the Policy Editor.
A policy model is a model for describing policies, and the related information useful for their implementation and quality assurance, hierarchically. The PERICLES policy model is described in D5.2 and D5.3 in a general form, as well as in the Ecosystem model based on LRM; it makes use of the Digital Ecosystem and dependency concepts and supports policy derivation.
Preservation by design is an approach to capturing and modelling not only the digital content itself, but also the surrounding digital ecosystem and its evolution, with the purpose of facilitating future accessibility and reuse.
A preservation system is a dedicated system to preserve Digital Objects and offer access to the objects.
These are implementations of requirements from the use cases that provide stakeholders with information about the quality of a specific feature or functionality.
A process transforms an input into a certain output through linked activities, and can invoke other systems or require human interaction. A process describes how something should be done (including the circumstances in which it should be done) and defines the individual steps for achieving the goal. Processes can be described in varying degrees of granularity, formality and exactness, e.g. as free text, a flowchart, a formal notation, or in an executable form that a computer can run. A process can be regarded as the operationalisation of policies.
A business process is similar to a generic process, but indicates that a process is performed by an organisation/company to produce value for the organisation.
A process is similar to a workflow; there is no unique definition, and the terms are sometimes, but not always, used as synonyms. A workflow may be seen as a more generic way to describe how an organisation/company is organised to perform a certain activity. In contrast, a process is more focused on describing the detailed series of actions that are necessary to produce a result.
The Process Compiler takes one representation of a process model and transforms it into another form that can be executed by the workflow engine on the testbed. It collaborates closely with the ERMR by converting process model descriptions from the ERMR into executable BPMN test scripts. The Process Compiler offers web-service deployment, an API, and a connection for querying the ERMR.
The PERICLES Process Compiler allows one to transform and combine RDF-based descriptions of preservation processes (process entities in the digital ecosystem) into executable workflows (i.e. BPMN processes) that can be orchestrated and executed by the workflow engine on the testbed.
A process entity is an entity of the digital ecosystem that represents the high-level description of a process: which function it performs, which data it consumes and produces, and which services it runs, while delegating the low-level details to be described separately in an associated entity (implementation entity) as an executable file using a well-suited notation (e.g. BPMN).
A process model refers to a certain model that allows processes to be described. BPMN, for example, is a widely used standard for describing process models. It provides a graphical notation for processes, but such a model can also be used as an executable model by a BPMN-compatible interpreter.
“Program for the systematic monitoring and evaluation of the various aspects of a project, service, or facility to ensure that standards of quality are being met” (Webster)
In the context of PERICLES, QA is aimed at validating the correct application of policies, and other ecosystem entities, using QA criteria and the ecosystem model, as described in D5.2 and D5.3.
The Resource action language ReAL describes transformative actions on LRM based models. Its syntax is based on simple basic expressions composed through stream-oriented logical operators. ReAL is designed to handle dynamicity in RDF stores thanks to a much more adapted expressive power than standard alternatives based on production rules. Actions are logical combinations of RDF triple queries, insertion and deletion instructions and aim at updating the model, mainly in reaction to changes in the target ecosystem. Actions are triggered by events, and most importantly, can be combined within nested transactions in order to ease the specification of context-aware and globally consistent RDF modifications. ReAL is executed within the Linked Resource Model Service that embeds an interpreter.
“Information created, received, and maintained as evidence and information by an organization or person, in pursuance of legal obligations or in the transaction of business” (ISO 15489-1:2001)
“field of management responsible for the efficient and systematic control of the creation, receipt, maintenance, use and disposition of records, including processes for capturing and maintaining evidence of and information about business activities and transactions in the form of records” (ISO 15489-1:2001)
Anything that can be identified through a URL/URI respectively. In this sense e.g. a policy, a digital object, an information object, a process, a digital asset, etc. are also types of resources. It is not a resource in the sense of necessary resources to execute a computer process (e.g. CPU time, storage, network connection, web services).
Restoration metadata refers to the set of metadata which is necessary for a successful decapsulation of encapsulated information. This encompasses checksums, original file paths, encoding information, an identifier unambiguously identifying the algorithms applied for encapsulation, payload localisation information and the configuration parameters to be used for the decapsulation algorithm.
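An illustrative restoration-metadata record could look like the following (the field names and the JSON serialisation are assumptions for the example, not a PERICLES-defined schema):

```python
import hashlib
import json

def restoration_metadata(payload: bytes, original_path: str,
                         algorithm_id: str) -> str:
    """Build an illustrative restoration-metadata record covering the
    elements named above: checksum, original path, encoding,
    encapsulation algorithm identifier and payload localisation."""
    record = {
        "checksum_sha256": hashlib.sha256(payload).hexdigest(),
        "original_path": original_path,
        "encoding": "binary",
        "encapsulation_algorithm": algorithm_id,
        "payload_offset": 0,  # payload localisation information
        "decapsulation_parameters": {},
    }
    return json.dumps(record)

meta = restoration_metadata(b"payload", "/data/object.txt", "zip-packaging")
print(meta)
```

A decapsulation routine would read such a record to locate the payload, verify its checksum, and restore it to its original path and encoding.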
Risk is defined as the “effect of uncertainty on objectives” (ISO Guide 73:2009 (2009) Risk management – Vocabulary, International Organization for Standardization). The aim of risk management is to support the identification, assessment and mitigation of risks.
Semantics is the scientific study of meaning, a discipline 2000 years old. It has two main branches, word semantics and sentence semantics. As both pertain to natural as well as artificial languages, semantics is significant for newer disciplines like computer science and artificial intelligence, language philosophy, computational linguistics, semiotics, etc. As a consequence, any natural- or artificial-language-dependent application has a semantic aspect too. With culture, language and applications all evolving over time, capturing and handling this aspect is becoming an important methodological question across a range of subject areas, e.g. dynamic semantics and update semantics in linguistics and language philosophy.
As change affects all systems, including human societies and their products, these are prone to impacts from the outside world. To put it simply, changes in the context can modify the meaning of any content embedded in that context. Terminology, practice, organisational solutions, policies, requirements, technology, entities and their dependencies, and ultimately semantics itself are no exception to this rule, which makes the interplay between changes and semantics a delicate area of study for digital preservation, and a key objective for PERICLES.
As a result, we distinguish between the change of semantics vs. the semantics of change, the subtle difference being that the former pertains to any modifications of word or sentence meaning a natural or artificial language describing content may be exposed to, much like mutations modify the functional “meaning” (i.e. the consequences they induce) of biological agents like genes. On the other hand, the latter refers to the LRM as an artificial language able to define the meaning of its entities and the relations between them (the ontology part), and generate statements, i.e. sentence meanings with such ontological entities and relations. By treating change as the object of study, the LRM creates a metalevel description of semantics which is both a language and a modifiable description.
The branch of semantics used to create ontologies is called formal semantics, going back to Carnap's positivist logical semantics. Whereas this was preferred in WP3, in WP4 a different kind, the theory of semantic fields (Trier 1934), has been used to study localised value changes in a field modelled on physics. Given the two related methodologies, semantics and terminology are not different; rather, terminology is the study of index terms, i.e. words, and belongs to word semantics in linguistics, whereas logical propositions, predicates etc. are sentences that belong to sentence semantics. We need the above distinction to separate sentence semantics from word semantics: ontologies and the LRM address both, whereas field theory in its current version addresses word semantics (i.e. index terms) only, but both models can handle change.
The same as semantic change, although more grandiose; it also invites biological analogies and has fitting mathematics for formalisation. One might say evolution is the sum total of change processes, where a change process is the propagation of one or more local mutations (local value modifications).
Covers the description, understanding and modelling of what a change means with regard to the ecosystem preservation goal. It particularly focuses on the impact of change with respect to global properties such as consistency, time coherence and generalised provenance management (which includes tracking and capturing the causal relationships between changing objects). At the level of a preserved object, an important component for describing the semantics of change is the significant property (see below), which is actually an invariance property, i.e. the definition of what must not be affected by changes for the object to keep its essential nature.
The widest set of information related to a digital object is its environment information. This information is different from the ecosystem, as it is by definition related to an entity (the environment of an entity) and is not defined on its own. We consider environment information to include all the entities (DOs, metadata, policies, rights, services, etc.) potentially useful to correctly access, render and use the DO in different ways and in different communities.
Given this definition of Environment information, we define Significant Environment Information as follows:
A situation in which curation activities are integrated into the workflow of the researchers creating or capturing data. The word ‘sheer’ here is used to describe the ‘lightweight and virtually transparent’ way in which these curation activities are integrated, with minimal disruption (definition by Alistair Miles of the Science and Technology Facilities Council, UK), http://alimanfoo.wordpress.com/2007/06/27/zoological-case-studies-in-digital-curation-dcc-scarp-imagestore/.
“Significant properties are those aspects of the digital object which must be preserved over time in order for the digital object to remain accessible and meaningful” (Inspect project: http://www.significantproperties.org.uk/)
An important aspect of SP is that significance is not absolute; a property is significant only relative to (e.g.) an intended purpose, a stakeholder, or some other way of identifying a viewpoint. It is also important to note that SP have generally referred to intrinsic properties of digital objects, as opposed to the extrinsic aspects considered with SEI.
Significant properties are strongly connected with the notion of invariant/specification involved in the LRM versioning model, since both aim at characterising the essential properties of a resource.
In the scope of Significant Environment Information, significance weights express the importance of individual dependencies for a specific purpose. These weights can be represented by a single value, or by a set of parameters with specific semantics and a formula to compute an overall weight.
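One possible formula of the kind described is a normalised weighted sum over per-parameter significance values (the parameter names and this specific formula are illustrative; the model allows any formula over the parameter set):

```python
def overall_weight(parameters: dict, coefficients: dict) -> float:
    """Combine per-parameter significance values into a single weight.

    parameters:   parameter name -> significance value for this purpose
    coefficients: parameter name -> how much that parameter counts
    Returns the coefficient-weighted average of the significance values.
    """
    total = sum(coefficients.values())
    return sum(parameters[name] * coefficients[name]
               for name in parameters) / total

w = overall_weight(
    {"rendering": 0.9, "provenance": 0.4},   # illustrative parameters
    {"rendering": 2.0, "provenance": 1.0},
)
print(round(w, 3))  # 0.733
```

Representing the weight as a single value is then just the degenerate case of one parameter with coefficient 1.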
A task is a piece of work that cannot be (or, in the context of a particular scenario, need not be) broken down any further. This term is often used inside a process model to denote the granularity of an item in a process chain. In contrast, an activity consists of a series of tasks.
A Technical Service is a term that includes hardware and software plus any kind of interface. The software typically governs the behaviour of a technical service. A technical service can provide different services, where here we use “service” to mean any operation that a technical service offers to the outside. It can be a user interface, a service that provides value to an organisation, or a technical service used for automated machine-to-machine communication.
The PERICLES test bed is a reconfigurable experimental platform developed in the PERICLES project with the purpose of executing, validating and evaluating Test Scenarios. It integrates a common set of preservation tools developed within work packages WP3-WP6, together with additional third-party tools as appropriate. The tools are tested with scenarios drawn from the media and science case studies. Lessons learned from the test-bed help identify best practice guidelines for the development of certain services.
We use “User Community” to mean a designated community as defined by OAIS (or a subset of a designated community). "Designated Communities" is a concept of the OAIS to denote a group of users which influence the criteria for successful preservation. "An identified group of potential Consumers who should be able to understand a particular set of information. The Designated Community may be composed of multiple user communities. A Designated Community is defined by the Archive and this definition may change over time." (References: http://public.ccsds.org/publications/archive/650x0m2.pdf p. 1-11)
A workflow consists of an orchestrated and repeatable pattern of individual business processes, typically implemented by technical services. Workflows can be expressed in business process orchestration languages such as BPEL or BPMN. A workflow is performed by agents, e.g. institutions and their associated employees, and describes how work is organised or which steps need to be executed to achieve a certain result. The steps can include processes as part of the workflow. See also Process.