Glossary

Contains definitions of terms used in eGovPoliNet, partly based on the DCMI Metadata Terms.

 Agent-based Modelling
Gilbert defines agent-based modelling (ABM) as “a computational method that enables a researcher to create, analyse, and experiment with models composed of agents that interact within an environment” (Gilbert 2007). The ABM approach facilitates the investigation of social dynamics, where a collection of agents acts autonomously in various (social) contexts. The massively parallel, local interactions of agents can give rise to path dependencies, dynamic returns, and the interaction between the two. In such an environment, global phenomena such as the development and diffusion of technologies, the emergence of networks, or herd behaviour, which may cause a transformation of the observed system, can be modelled. The approach focuses on depicting the agents, their relationships and the processes governing their transformation; i.e. the social behaviour of the agents is at the forefront of consideration.
The application of ABM offers two major advantages (Gilbert and Troitzsch 1999):

- capability to show how collective phenomena come about and how the interaction of autonomous and heterogeneous agents leads to their genesis. ABM supports the isolation of critical behaviour in order to identify the agents that, more than others, drive the collective result of a (dynamic) system. Such simulations also endeavour to single out points in time at which the system exhibits qualitative rather than merely quantitative change.
- possibility to use agent-based models as computational laboratories to explore various institutional arrangements and potential paths of development, so as to assist and guide e.g. firms and policy makers in their particular decision contexts.

References:
Gilbert, N. and Troitzsch, K. (1999) Simulation for the Social Scientist, Berkshire, UK: Open University Press.
Gilbert, N. (2007) Agent-Based Models, London: Sage.
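The diffusion and herd-behaviour dynamics described in this entry can be sketched as a minimal agent-based model. This is an illustrative sketch only: the ring topology, adoption thresholds and seed values are assumptions of the example, not taken from Gilbert.

```python
import random

class Agent:
    """An agent that adopts a technology once a sufficient share of its
    neighbours has adopted it (a simple threshold rule)."""
    def __init__(self, threshold):
        self.threshold = threshold   # share of adopting neighbours required
        self.adopted = False
        self.neighbours = []

    def step(self):
        if self.adopted or not self.neighbours:
            return
        share = sum(n.adopted for n in self.neighbours) / len(self.neighbours)
        if share >= self.threshold:
            self.adopted = True

def run(n_agents=100, n_steps=50, seed=1):
    rng = random.Random(seed)
    agents = [Agent(rng.uniform(0.1, 0.6)) for _ in range(n_agents)]
    # Ring topology: each agent observes its two nearest neighbours.
    for i, agent in enumerate(agents):
        agent.neighbours = [agents[i - 1], agents[(i + 1) % n_agents]]
    for agent in rng.sample(agents, 5):   # a few early adopters
        agent.adopted = True
    for _ in range(n_steps):
        for agent in agents:
            agent.step()
    return sum(agent.adopted for agent in agents)
```

Running the model shows the macro-level outcome (how far adoption spreads) emerging from purely local decisions, which is exactly the kind of collective phenomenon the entry describes.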
 Business Process
According to Davenport, a business process is a structured set of activities designed to produce a specific output (Davenport, 1993). For Hammer and Champy (1993), a process is defined as "a collection of activities that takes one or more kinds of input and creates an output that is of value to the customer. A business process has a goal and is affected by events occurring in the external world or in other processes."
Smirnov et al. argue that a modern business process should be seen as a distributed system in which activities are performed by various employees, at different locations, using a heterogeneous set of IT systems. A business process typically crosses the borders of organisational departments and even companies (Smirnov et al., 2012).
Business processes can be modelled. To this end, Smirnov et al. (2012) define business process models as key artefacts representing how work is performed in organisations. These models can help an organisation to document, evaluate, or improve its business processes.
References:
Davenport, T.H. (1993), Process Innovation, Harvard Business School Press, Boston, MA.
Hammer, M., and Champy, J. (1993). Reengineering the Corporation: A Manifesto for Business Revolution Harper Business.
Smirnov, S., Reijers, H., Weske, M., and Nugteren, T. (2012), Business process model abstraction: a definition, catalog, and survey, Distributed and Parallel Databases, Vol. 30, No. 1, pp. 63-99.
 Community
The word community was derived from the Latin communitas, a broad term for fellowship or organised society. According to the Merriam-Webster Dictionary (2013), a community is a unified body of individuals, e.g. the people with common interests living in a particular area, or a group of people with a common characteristic or interest living together within a larger society.
Community usually refers to a social unit that shares common values (Smith, 2001). Specifically, in biology a community is a group of interacting living organisms sharing a populated environment.
Tönnies (2005) distinguishes two types of human association in sociology: community and society. He argues that a community is perceived to be a tighter and more cohesive social entity (the presence of a "unity of will"). The perfect expressions of community are family and kinship. Society, on the other hand, is a group in which the individuals who make it up are motivated to take part purely by self-interest. As Tönnies proposed, in the real world no group is either pure community or pure society.
Community building is a field of practices directed toward the creation or enhancement of community among individuals within a regional area or with a common interest.
References:
Merriam-Webster.com (2013), Community. Retrieved April 17, 2013
Smith, M. K. (2001), ‘Community’ in the encyclopedia of informal education, Retrieved April 17, 2013
Tönnies, F. (2005), Gemeinschaft und Gesellschaft, Darmstadt: Wissenschaftliche Buchgesellschaft, 8th edition (reprint).
 Complex Adaptive System (CAS)
Government officials and other decision makers increasingly encounter a daunting class of problems that involve systems composed of very large numbers of diverse interacting parts (Shalizi, 2006). These systems are prone to surprising, large-scale, seemingly uncontrollable behaviours. These traits are the hallmarks of what scientists call complex systems. A complex system is composed of many parts that interact with and adapt to each other and, in so doing, affect their own individual environments. The combined system-level behaviour arises from the interactions of parts that are, in turn, influenced by the overall state of the system. Global patterns emerge from the autonomous but interdependent mutual adjustments of the components (Jacobson et al., 2011).
According to John Holland, a CAS is a special category of complex system dealing with living systems that have the capacity to change, learn from experience and sometimes forecast (Holland, 1999). The control of a CAS tends to be highly dispersed and decentralised. If there is to be any coherent behaviour in the system, it has to arise from competition and cooperation among the agents themselves; the overall behaviour of the system is the result of a huge number of decisions made every moment by many individual agents (Holland, 1992, p. 17). Typical phenomena in complex adaptive systems are the emergence of macro-level structures due to interactions at the micro-level (self-organisation). These macro-structures in turn determine the behavioural freedom at the micro-level (downward causation).
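The emergence of macro-level structure from purely local micro-level rules can be illustrated with a toy example: a one-dimensional majority-rule cellular automaton, in which each cell follows only its immediate neighbours, yet stable macro-domains form. The rule, ring size and random seed are illustrative assumptions of this sketch, not drawn from Holland.

```python
import random

def majority_step(cells):
    """One synchronous update: each cell takes the majority state of
    itself and its two ring neighbours (a purely local rule)."""
    n = len(cells)
    return [1 if cells[i - 1] + cells[i] + cells[(i + 1) % n] >= 2 else 0
            for i in range(n)]

def run(n=60, steps=30, seed=3):
    rng = random.Random(seed)
    cells = [rng.randint(0, 1) for _ in range(n)]   # disordered micro-state
    for _ in range(steps):
        cells = majority_step(cells)
    return cells
```

Starting from a random configuration, the system self-organises into contiguous blocks of equal cells: a macro-level pattern no single cell "knows about", arising only from local adjustments.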
Related terms: Complex System
References:
Holland, J. H. (1992), Complex Adaptive Systems, Daedalus, Vol. 121(No. 1), pp. 17-30.
Holland, John H. (1999), Emergence: from chaos to order, Reading, Mass: Perseus Books.
Jacobson, M., Kapur, M., So, H.-J., & Lee, J. (2011). The ontologies of complexity and learning about complex systems. Instructional Science, Vol. 39(No. 5), pp. 763-783. doi: 10.1007/s11251-010-9147-0
Shalizi, C. R. (2006). Methods and Techniques of Complex Systems Science: An Overview. In T. S. Deisboeck & J. Y. Kresh (Eds.), Complex Systems Science in Biomedicine (pp. 33-114). Springer US.
 Conceptual Model
In science, there is a need to formally describe some aspects of the physical and social world around us for purposes of understanding and communication. Such descriptions, often referred to as conceptual schemata, require the adoption of a formal notation, namely a conceptual model (Mylopoulos, 2012). By assuming that any given domain consists of objects, relationships, and concepts, we commit ourselves to a specific way of viewing domains, namely the conceptual model. The set of concepts used in a particular domain constitutes a conceptualisation of that domain. The specification of this conceptualisation is sometimes called an ontology of the domain (Olivé, 2007).
A concept has an extension and an intension. The extension of a concept is the set of its possible instances, while the intension is the property shared by all its instances. Concepts allow us to classify the things that we perceive as exemplars of the concepts that we have. In other words, what we observe depends on the concepts that we employ in the observation. Classification is the operation that associates an object with a concept; the inverse operation, instantiation, gives an instance of a concept. The set of objects that constitutes an instance of a concept at a given time is known collectively as the population of the concept at that time (Olivé, 2007).
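These notions can be made concrete in a short sketch, in which a concept's intension is represented as a predicate and its population is computed by classification. The `Adult` concept and the sample objects are purely illustrative choices of this example.

```python
class Concept:
    """A concept whose intension is a predicate (the property shared by
    all instances); the extension is whatever satisfies it."""
    def __init__(self, name, intension):
        self.name = name
        self.intension = intension

    def classify(self, obj):
        """Classification: does this object fall under the concept?"""
        return self.intension(obj)

    def population(self, objects):
        """The population: the objects that are instances at this time."""
        return [o for o in objects if self.classify(o)]

# Illustrative domain: the concept 'Adult' over a set of person objects.
adult = Concept("Adult", lambda person: person["age"] >= 18)
people = [{"name": "Ana", "age": 34}, {"name": "Ben", "age": 12}]
```

Here `adult.classify(people[0])` performs classification, and `adult.population(people)` yields the concept's population among the given objects.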
A conceptual model brings several advantages including: the documentation and the re-use of the domain knowledge, the reach of consensus among domain experts and IT stakeholders and the provision of a firm basis for the design and development of ISs within the domain (Wand and Weber, 2002).
To the best of our knowledge, only limited work on conceptual models for the policy modelling domain has been proposed so far (as of June 2013). The Consistent Conceptual Description (CCD) approach (Wimmer and Scherer, 2011) provides a vocabulary to describe policy contexts, and describes how CCD supports the semi-automatic transformation of conceptual models within a policy context to generate formal policy models. Another preliminary conceptual model has been proposed by Wyner et al. (2011) in the form of an ontology.
References:
Mylopoulos, J. (2012) Conceptual Modelling and Telos
Olivé, A. (2007) Conceptual Modeling of Information Systems. Springer, Heidelberg.
Wand and Weber (2002) Research Commentary: Information Systems and Conceptual Modeling - A Research Agenda. Information Systems Research, 13(4), pp. 363-376.
Wimmer M. and Scherer S. (2011). Conceptual Models Supporting Formal Policy Modelling: Metamodel and Approach. Modelling Policy-making (MPM 2011).
Wyner A., Atkinson K. and Bench-Capon T. (2011). Semantic Models and Ontologies for Modelling Policy-Making. Modelling Policy Making (MPM 2011).
 Conceptual Modelling
Conceptual modelling is the process of abstracting a model from a real or proposed system (Robinson 2008, p. 3). Mylopoulos (1992) defines conceptual modelling as the activity of formally describing some aspects of the physical and social world around us for purposes of understanding and communication. The outcome of the conceptual modelling process is a conceptual model. Conceptual modelling is an iterative process, with the conceptual model being continuously revised throughout. The main difficulty in conceptual modelling is to abstract reality to an appropriate level of simplification (Pidd, 2003).
Conceptual modelling is a complex process because we do not have measurable criteria for evaluating the value of its outcome, the conceptual model (Pritsker 1987). Therefore, during the process of conceptual modelling, it is useful to consider a set of system requirements. The requirements provide a basis against which to determine whether the obtained conceptual model is appropriate. Robinson (2008, p. 19) identifies four main requirements that should be fulfilled when assessing the outcome of conceptual modelling:

- validity (a conceptual model can be developed into a simulation model with sufficient accuracy),
- credibility (similar to validity, but from the viewpoint of a client),
- utility (developed model will be useful for the decision making),
- feasibility (the conceptual model can be developed into a [simulation] model with the available time, resources and data).

In policy making, conceptual modelling is carried out by policy analysts who extensively analyse available documents in order to get an accurate overview of the policy domain, i.e. to develop a conceptual model of it. They also collaborate with the stakeholders and the policy modellers to discuss model elements.
Related terms: Model, Modelling, Tool
References:
Mylopoulos, J. (1992). Conceptual Modeling and Telos. Chapter 2 in Loucopoulos, P. and Zicari, R. (eds.), Conceptual Modeling, Databases, and CASE: An Integrated View of Information Systems Development, New York.
Pidd, M. (2003). Tools for Thinking: Modelling in Management Science, 2nd ed. Wiley, Chichester, UK.
Pritsker, A.A.B. (1987). Model Evolution II: An FMS Design Problem. Proceedings of the 1987 Winter Simulation Conference (Thesen, A., Grant, H. and Kelton, W.D., eds.). IEEE, Piscataway, NJ, pp. 567-574.
Robinson, S. (2008). Conceptual modelling for simulation part I: definition and requirements. Journal of the Operational Research Society, 59 (3), pp. 278-290.
 Declarative Model
A declarative model is a model that expresses the logic of a computation without describing its control flow. The model attempts to minimise or eliminate side effects by describing what the program should accomplish in terms of the problem domain, rather than how to accomplish it as a sequence of programming language primitives. The declarative model contrasts with the imperative model, in which algorithms are implemented in terms of explicit steps. Generally, a declarative model is a mathematical representation of a physical system implemented in computer code that is declarative: the code contains a number of equations, rather than imperative assignments, that declare or describe the behavioural relationships.
Two classes of declarative models are system dynamics models and declarative agent-based models, as mentioned in Villa et al. (2006). The processes leading to outcomes, and the outcomes themselves, emerge from simulations with these models. As noted in Fahland et al. (2009), the principal difference between declarative agent-based models and system dynamics models is that agent-based models describe social interactions whilst system dynamics models do not. Agent-based models in which the behaviour of agents is determined by if-then rules are inherently declarative.
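The contrast can be shown in a few lines. The two functions below compute the same result, and the rule table sketches how a declarative agent's behaviour can be stated as if-then rules rather than as control flow; all names and rules are illustrative.

```python
# Imperative: explicit control flow and step-by-step state changes.
def total_imperative(values):
    total = 0
    for v in values:
        if v > 0:
            total += v
    return total

# Declarative: state *what* is wanted, not how to iterate.
def total_declarative(values):
    return sum(v for v in values if v > 0)

# A declarative agent: behaviour given as (condition, action) rules.
RULES = [
    (lambda state: state["hungry"], "eat"),
    (lambda state: not state["hungry"], "work"),
]

def act(state):
    # A generic interpreter applies the first rule whose condition holds;
    # the rules themselves contain no control flow.
    for condition, action in RULES:
        if condition(state):
            return action
```

In the declarative variants, the behavioural relationships are stated as expressions and rules; the order of evaluation is left to a generic mechanism rather than spelled out by the modeller.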
References:
Fahland, D., Lübke, D., Mendling, J., Reijers, H. Weber, B., Weidlich, M., Zugal, S.(2009). Declarative versus Imperative Process Modeling Languages: The Issue of Understandability. In: BPMDS 2009 and EMMSAD 2009, LNBIP 29, pp. 353-366, Springer-Verlag Berlin Heidelberg.
Villa, F., Donatelli, M., Rizzoli, A., Krause, P., Kralisch, S., van Evert, F. (2006). Declarative modelling for architecture independence and data/model integration: a case study. In: iEMSs 2006 Summit on Environmental Modelling and Software. [Online] http://www.iemss.org/iemss2006/papers/s5/278_Villa_1.pdf (verified on November 7, 2013).
 Democratic Governance
Democratic Governance implies good governance from the human development perspective (UNDP, 2002). The Democratic Governance concept refers to a governance process based on fundamental and universally accepted principles, including participation, accountability, transparency, the rule of law, separation of powers, access, subsidiarity, equality and freedom of the press (United Nations Economic and Social Council, 2006).
Related terms: Good Governance, Governance, Policy Governance
References:
UNDP (2002), Human Development Report 2002. Deepening democracy in a fragmented world. New York: Oxford University Press.
United Nations Economic and Social Council (2006). Definition of basic concepts and terminologies in governance and public administration.
 Discipline
A discipline is characterised in the scientific literature as a recognised approach to a specific issue. It enables in-depth reflection and discussion to take place. Generally, disciplines are closed environments that develop common epistemologies and ontologies (Barrett, 2012).
Disciplines in academia are well-established spaces that are often institutionally defined (Departments, Faculties, etc.) (Chettiparamb, 2007). Teaching and research are often executed in single-disciplinary studies, as 'expertise' is determined by recognition from peers from the same discipline.
An understanding that different disciplines must work together has emerged. This can operate in different ways: multidisciplinary, interdisciplinary, or transdisciplinary activities (e.g. Choi and Pak, 2006). In multidisciplinary interactions, different disciplines work together to solve particular policy (or other) problems by making use of knowledge, methods and theories drawn from their own specific disciplines (Chua and Yang, 2008). Interdisciplinarity attempts to build up new approaches to solving 'real' problems by developing methods, theories and practices that cross two or more disciplines, often resulting in the fusing of two or more disciplines (Newell, 2001). Transdisciplinarity is a third ideal type of interaction: different disciplines are 'integrated,' thereby developing new understandings of the role of knowledge in solving given policy or social problems (Lawrence and Despres, 2004).
Policy modelling is one of the fields where discussion is vivid on the role of multiple disciplines, with many observers aiming to develop the domain as an opportunity for interdisciplinary approaches to flourish, given the specific requirements of the area (information science, public administration, political science, etc.) (See Chen, Gregg and Dawes, 2007).
References:
Barrett, Brian D. 2012. “Is Interdisciplinarity Old News? a Disciplined Consideration of Interdisciplinarity.” British Journal of Sociology of Education 33 (1) (January): 97–114.
Chen, H, L Brandt, V Gregg, and Sharon S Dawes. 2007. Digital Government: E-Government Research, Case Studies, and Implementation.
Chettiparamb, A. 2007. “Interdisciplinarity: a Literature Review.” … Teaching and Learning Group.
Choi, Bernard C K, and Anita W P Pak. 2006. “Multidisciplinarity, Interdisciplinarity and Transdisciplinarity in Health Research, Services, Education and Policy: 1. Definitions, Objectives, and Evidence of Effectiveness..” Clinical & Investigative Medicine 29 (6) (December): 351–364.
Chua, Alton Y K, and Christopher C Yang. 2008. “The Shift Towards Multi-Disciplinarity in Information Science.” Journal of the American Society for Information Science and Technology 59 (13) (November): 2156–2170.
Lawrence, Roderick J, and Carole Despres. 2004. “Futures of Transdisciplinarity.” Futures 36 (4): 397–405.
Newell, William H. 2001. “A Theory of Interdisciplinary Studies.” Issues in Integrative Studies (19): 1–25.
 Dynamic Stochastic General Equilibrium Models
Theory-based macroeconomic forecasting and policy analysis has in recent years largely relied on (calibrated) Dynamic Stochastic General Equilibrium (DSGE) models. In particular, most policy-relevant institutions, such as central banks or the International Monetary Fund (IMF), rely on DSGE models, for example the Global Economy Model (GEM) used by the IMF (Bayoumi, 2004). The majority of recent contributions in this area rely on New Keynesian monetary models.
DSGE models are based on the concept of rational representative agents whose behaviour is derived from their preferences and technologies by solving inter-temporal optimization problems. A number of different types of frictions and rigidities have been introduced in parts of the model such as the labour or the financial markets. Forecasting in this framework is typically based on Bayesian estimation of the model using time series of macroeconomic quantities such as output, consumption, investment, wages, inflation and interest rates (Smets and Wouters, 2003; see Del Negro and Schorfheide, 2012, for a recent survey).
Related terms: Economic Theories, Econometrics, Econometric Modelling, Forecasting, Macroeconomic Models, Macro-Simulation, Policy Modelling
References:
Bayoumi, T. (2004), GEM: A New International Macroeconomic Model, Occasional Paper 239, International Monetary Fund
Del Negro, M. and F. Schorfheide (2012), “DSGE Model-Based Forecasting”, Staff Report No. 554, Federal Reserve Bank of New York.
Smets, F. and R. Wouters (2003), “An Estimated Dynamic Stochastic General Equilibrium Model”, Journal of the European Economic Association, 1, 1123-1175
 Econometric Modelling
Econometric modelling is the process of modelling the relationships believed to hold between various economic variables pertaining to a particular economic phenomenon under study. Econometrics is accordingly based upon the development of statistical methods for estimating economic relationships, testing economic theories, and evaluating and implementing government and business policy.
Typically, econometric models are fitted using ordinary least-squares regression or maximum-likelihood estimation. These methods relate one or more exogenous variables to each endogenous variable.
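A minimal sketch of the simplest case is a single-equation model fitted by ordinary least squares. The data below, and the "true" coefficients used to generate them, are simulated purely for illustration.

```python
import random
from statistics import mean

def ols(x, y):
    """One-regressor ordinary least squares in closed form:
    slope = cov(x, y) / var(x), intercept = mean(y) - slope * mean(x)."""
    mx, my = mean(x), mean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return my - slope * mx, slope

# Illustrative data: endogenous y generated from exogenous x with
# assumed coefficients 2.0 and 0.5, plus a small random disturbance.
rng = random.Random(0)
x = [rng.uniform(0, 10) for _ in range(200)]
y = [2.0 + 0.5 * xi + rng.gauss(0, 0.1) for xi in x]
intercept, slope = ols(x, y)
```

With enough observations and a small disturbance, the estimated intercept and slope recover the coefficients used to generate the data, which is the basic idea behind estimating an economic relationship from observed variables.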
According to Greene (2003), experience has shown that each of three viewpoints, that of statistics, economic theory, and mathematics, is a necessary, but not by itself a sufficient, condition for a real understanding of the quantitative relations in modern economic life. It is the unification of all three that is powerful, and it is this unification that constitutes econometrics and econometric modelling.
Stock and Watson (2011) state that the most common application of econometrics is the forecasting of important macroeconomic variables such as interest rates, inflation rates, and gross domestic product. While forecasts of economic indicators are highly visible and often widely published, econometric methods can also be used in economic areas that have nothing to do with macroeconomic forecasting, such as the effect of school spending on student performance, the effect of political campaign expenditures on voting outcomes, the effect of reducing class size on elementary school education, racial discrimination in the market for home loans, the effect of cigarette taxes on smoking, etc.
Related terms: Economic Theories, Macroeconomic Models, Forecasting, Econometrics
References:
Greene, W. H. (2003) Econometric Analysis. 5th Edition. New Jersey: Prentice Hall, 2003. ISBN 0-13-066189-9.
Stock, J. H., Watson, M. W. (2011) Introduction to Econometrics. 3rd Edition. New Jersey: Prentice Hall, 2011. ISBN 978-1408-26-4331.
 Econometrics
Econometrics is the application of mathematics, statistical methods, and, more recently, computer science, to economic data and is described as the branch of economics that aims to give empirical content to economic relations (M. Hashem Pesaran, 1987). More precisely, it is "the quantitative analysis of actual economic phenomena based on the concurrent development of theory and observation, related by appropriate methods of inference."(Samuelson et al., 1954)
Econometrics is the unification of economics, mathematics, and statistics. This unification produces more than the sum of its parts. Econometrics adds empirical content to economic theory allowing theories to be tested and used for forecasting and policy evaluation.
There are recent signs of progress. Work by Chib (1995), Chib and Greenberg (1995) and others on importing into econometrics Monte Carlo integration methods such as importance sampling and Markov chain Monte Carlo, which were applied successfully in other fields (see Casella and George (1992) and Tierney (1994)), has made possible complete Bayesian analyses of models for which this would have been impossible a few years ago. This approach relaxes the constraint on model complexity somewhat, so that DSGE (dynamic stochastic general equilibrium) models that tell appealing stories about behaviour can at the same time be complex enough to fit the data about as well as the best Bayesian reduced-form VARs (vector autoregression models). Moreover, recent work on the relation of econometric modelling and model choice to policy analysis, for example Brock et al. (2003) and Leeper and Zha (2001), suggests models for policy evaluation in uncertain economic environments.
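As a toy illustration of the Markov chain Monte Carlo methods mentioned above, a random-walk Metropolis sampler can be written in a few lines. The target density, step size and sample count here are illustrative choices, not a recipe for estimating an actual econometric model.

```python
import math
import random

def metropolis(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis: propose x' = x + N(0, step) and accept
    with probability min(1, target(x') / target(x))."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0, step)
        # Compare on the log scale for numerical stability.
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

# Toy target: a standard normal "posterior" (log density up to a constant).
samples = metropolis(lambda t: -0.5 * t * t, x0=0.0, n_samples=20000)
```

The draws approximate the target distribution, so posterior quantities (means, intervals) can be estimated by simple averages over the chain; this is the mechanism that makes complete Bayesian analysis of otherwise intractable models feasible.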
Related terms: Dynamic Stochastic General Equilibrium Models, Econometric Modelling
References:
M. Hashem Pesaran (1987), "Econometrics," The New Palgrave: A Dictionary of Economics, Vol. 2, pp. 8-22
P. A. Samuelson, T. C. Koopmans, and J. R. N. Stone (1954), "Report of the Evaluative Committee for Econometrica," Econometrica 22(2), pp. 141-146
Chib, S., (1995), Marginal likelihood from the Gibbs output, Journal of the American Statistical Association, 90(432), p. 1313-1321
Chib, S. and E. Greenberg, (1995), Understanding the Metropolis-Hastings algorithm, The American Statistician, 49(4), p. 327-335
Casella, G. and E. George, (1992), Explaining the Gibbs sampler, The American Statistician, 46(3), p. 167-174
Tierney, L., (1994), Markov Chains for Exploring Posterior Distributions, Annals of Statistics, 22, p. 1701-1762
Brock, W.A., Durlauf, S.N. and K.D. West, (2003), Policy evaluation in uncertain economic environments, Brookings Papers on Economic Activity (1), p. 1-67
Leeper, E. and T. Zha, (2001), Modest policy interventions. Technical report, Indiana University and Federal Reserve Bank of Atlanta
 Economic Theories
Economics is defined in various ways, but scarcity is always part of the definition. The most general definition of economics is perhaps: "Economics is the discipline studying the organization of economic activities in society" (Witztum, 2011). According to Slavin (1989), economics is a set of tools that enables us to use our scarce resources efficiently, the end result being the highest possible standard of living. A theory is a supposition or a system of ideas intended to explain something, especially one based on general principles independent of the thing to be explained. The initial stage of any theory is the definition of the subject matter under investigation.
Economic theory is a broad concept for a well-substantiated explanation and understanding of commercial activities such as the production and consumption of goods.
The Wealth of Nations, written by Adam Smith (Smith, 1776), is usually referred to as the first classical economic concept or theory. Under this theory, little government intervention is necessary to support a society. Smith and his followers believed that when an individual pursues his self-interest, he indirectly promotes the good of society. In The Theory of Moral Sentiments, Smith wrote: "How selfish soever man may be supposed, there are evidently some principles in his nature which interest him in the fortune of others and render their happiness necessary to him though he derives nothing from it except the pleasure of seeing it" (Smith, 1759). Self-interested competition in the free market would therefore tend to benefit society as a whole.
Mainstream economic theory relies upon a priori quantitative economic models. 

References:
Slavin, S. L. (1989). Introduction to Economics. Irwin, Boston.
Smith, A. (1776). Nature and Causes of the Wealth of Nations (Wealth of Nations). Available on: http://www.econlib.org/library/Smith/smWN.html
Smith, A. (1759). The Theory of Moral Sentiments. Available on: http://www.econlib.org/library/Smith/smMS.html
Witztum, A. (2011). Introduction to economics. University of London.
 Forecasting
Forecasting in general is the process of making statements about events with future outcomes. According to Hyndman and Athanasopoulos (2012), forecasting is about predicting the future as accurately as possible, given all of the information available, including historical data and knowledge of any future events that might impact the forecasts.
Forecasting is estimating in unknown situations. Prediction is a more general term and connotes estimating for any time series, cross-sectional, or longitudinal data (IIF, 2013).
The appropriate forecasting methods depend largely on what data are available. If no data are available, or if the available data are not relevant to the forecasts, then qualitative forecasting methods must be used. These methods are not pure guesswork; there are well-developed structured approaches to obtaining good forecasts without using historical data. Examples of qualitative forecasting methods are informed opinion and judgement, the Delphi method, market research, scenario development, science and technology roadmapping, and historical life-cycle analogy (cf. e.g. Russell Bernard, 2012).
Quantitative forecasting methods are used to forecast future data as a function of past data. A forecasting method is an algorithm that provides a point forecast: a single value that is a prediction of the value at a future time period (Hyndman et al., 2008). According to Carnot et al. (2005), quantitative forecasting can be applied when two conditions are satisfied: (a) numerical information about the past is available; (b) it is reasonable to assume that some aspects of past patterns will continue into the future. Examples of quantitative forecasting methods are time series methods such as the moving average, weighted moving average, Kalman filtering, exponential smoothing, autoregressive (integrated) moving average (ARMA or ARIMA), extrapolation, linear prediction and trend estimation; artificial intelligence methods such as data mining, machine learning and pattern recognition; and simulation. For applications with R see Shumway and Stoffer (2011).
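Two of the simplest quantitative methods can be sketched directly. The demand series and the parameter values below are illustrative assumptions of the example.

```python
def moving_average_forecast(series, window=3):
    """Point forecast for the next period: the mean of the last
    `window` observations."""
    recent = series[-window:]
    return sum(recent) / len(recent)

def exponential_smoothing_forecast(series, alpha=0.3):
    """Simple exponential smoothing: the level is updated as
    l_t = alpha * y_t + (1 - alpha) * l_{t-1}; the point forecast
    for the next period is the final level."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

demand = [12, 13, 12, 14, 15, 14, 16]  # illustrative past data
```

Both methods use only past numerical information, satisfying condition (a), and assume the recent pattern persists, condition (b); exponential smoothing simply weights recent observations more heavily than older ones.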
References:
Carnot, N., Koen, V., Tissot, B. (2005). Economic Forecasting. Palgrave MacMillan New York.
Hyndman, R. J., Athanasopoulos, G. (2012). Forecasting: principles and practice. O Texts Online, Open-Access Textbooks.
Hyndman, R. J., Koehler, A. B., Ord, J. K., Snyder, R. D. (2008). Forecasting with Exponential Smoothing - The State Space Approach. Springer-Verlag Berlin Heidelberg.
International Institute of Forecasters (IIF) (2013).
Shumway, R. H., Stoffer, D. S. (2011). Time Series Analysis and Its Applications. With R Examples. Springer Science+Business Media New York.
Russell Bernard, H. (2012). Social Research Methods: Qualitative and Quantitative Approaches. SAGE Publications.
 Formal Modelling
Formal modelling means representing a system by a formal model. Formal modelling is well defined, and is described as the application of a fairly broad variety of theoretical computer science fundamentals, in particular logic calculi, formal languages, automata theory, and program semantics, but also type systems and algebraic data types, to problems in software and hardware specification and verification (Monin, 2003).
A wide variety of formal models have been created to support policy making. The work of Boer et al. (2011) discusses how the interests and field theory promoted by public administration, as a stakeholder in policy argumentation, arise directly from its problem-solving activities. They propose a framework for public administration based on a model for diagnostic problem solving.
Policy makers usually face the problem of finding the relevant parts of the regulations, with their impacts, in order to change legal texts consistently and coherently. As a solution to these demands, Szoke et al. (2011) presented a formal semantic enrichment approach for policy makers which allows reasoning about the ambiguity of legal texts and inferring new knowledge. Wyner et al. (2011) proposed a semantics-based model: they discussed the role of semantic models and ontologies in modelling policy making, describing policy making as a cyclical, multi-stage process with several stages: evaluation, agenda setting, policy formulation, decision and implementation.
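As a toy example in this spirit, the cyclical policy process can be written down as a small formal transition system over which properties can be checked mechanically. The transition relation below is an illustrative assumption of this sketch, not taken from Wyner et al.

```python
# A toy formal model: the policy cycle as a finite transition system.
TRANSITIONS = {
    "agenda setting": {"policy formulation"},
    "policy formulation": {"decision"},
    "decision": {"implementation"},
    "implementation": {"evaluation"},
    "evaluation": {"agenda setting"},   # the cycle closes
}

def reachable(start):
    """All stages reachable from `start` (a simple graph search),
    so that properties such as 'every stage can eventually be
    revisited' can be verified mechanically."""
    seen, frontier = set(), [start]
    while frontier:
        state = frontier.pop()
        if state in seen:
            continue
        seen.add(state)
        frontier.extend(TRANSITIONS.get(state, ()))
    return seen
```

Because the model is formal, claims about the process ("evaluation is reachable from agenda setting") become checkable computations rather than informal assertions, which is the essence of the verification uses described above.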
Related term: Formal Model, Formal Method
References:
Monin, J.F. (2003), Understanding Formal Methods, Springer, 2003, XVI, 276.
Boer, A., Van Engers T., and Sileno G.(2011), A Problem Solving Model for Regulatory Policy Making. In Proceedings of the Workshop on Modelling Policy-making (MPM 2011), in conjunction with The 24th International Conference on Legal Knowledge and Information Systems (JURIX 2011). Vienna (Austria), 2011.
Szoke, A., Forhecz, A., Mascar, K., Strausz, G. (2011), Linking Semantic Enrichment to Legal Documents. In Proceedings of the Workshop on Modelling Policy-making (MPM 2011), Vienna (Austria), 2011.
Wyner, A. Atkinson, K and Bench-Capon ,T. (2011), Semantic Models and Ontologies in Modelling Policy-making. In Proceedings of the Workshop on Modelling Policy-making (MPM 2011), Vienna (Austria), 2011.
 Good Governance
The Good Governance concept introduces a normative dimension concerned with the quality of governance (Santiso, 2001). Most considerations of "good governance" focus on transparency and accountability as well as citizens' participation (Mokre and Riekmann, 2006) and associate governance quality with the level of participation, transparency, accountability, rule of law, effectiveness and equity (OECD, 2006). According to the World Bank, Good Governance involves "the combination of transparent and accountable institutions, strong skills and competence, and a fundamental willingness to do the right thing that enable a government to deliver services to its people efficiently" (Gisselquist, 2012). Good governance is also defined as a "competent management of a country's resources and affairs in a manner that is open, transparent, accountable, equitable and responsive to people's needs" (AusAID, 2000).
Related terms: Stakeholder, Stakeholder Engagement, Public Governance, Policy Governance, Governance, Democratic Governance
References:
Santiso, C. (2001), Good Governance and Aid Effectiveness: The World Bank and Conditionality, The Georgetown Public Policy Review, 7(1), 1-22.
Mokre, M. and Riekmann, S. (2006), From Good Governance to Democratic Governance? A policy review of the first wave of European governance research. EU Research in Social Sciences and Humanities.
OECD. (2006), Applying Strategic Environmental Assessment: Good Practice Guidance for Development Co-Operation.
Gisselquist, R. M. (2012), Good Governance as a Concept, and Why This Matters for Development Policy.
AusAID. (2000), Good Governance: Guiding principles for implementation.
 Governance
Governance refers to the capacity of governing systems to co-ordinate policy and to solve public problems in a complex context (Pierre, 2000). Governance is "the sum of the many ways individuals and institutions, public and private, manage their common affairs. It is a continuing process through which conflicting or diverse interests may be accommodated and co-operative action may be taken" (Commission on Global Governance, 1995). The Governance concept implies the management of a country’s resources for development at all levels using mechanisms, processes and institutions for encouraging citizens and groups to articulate their interests, mediate their differences and exercise their legal rights and obligations (UNDP, 1997). Governance involves coordination and coherence among various actors with different objectives such as political actors and institutions, interest groups, civil society, non-governmental and transnational organizations (Pierre, 2000).
Related terms: Good Governance, Democratic Governance, IT Governance, Public Governance
References:
Commission on Global Governance (1995), Our Global Neighbourhood, New York: Oxford University Press, p. 2
Pierre, J. (2000), Debating Governance: Authority, Steering, and Democracy, Oxford University Press, USA.
UNDP. (1997), Governance for sustainable human development: A UNDP policy document.
 Graph Theory
Graph theory concerns graphical models of pairwise relations between objects (e.g. Hansen, Shneiderman and Smith, 2011). Such a graph consists of nodes (or vertices), representing the objects, and edges (or lines), representing the relationships between them (West, 2000). Graphs are often drawn with a dot or circle for every node or vertex and an arc between two vertices that are connected by an edge (ibid). Graphs can be directed or undirected; in directed graphs, arrows indicate the direction of each edge (ibid).
Such graphs can be used to represent connections between websites or decisions, but also between actors in social networks (see social network, SNA). There are many good introductory books (for example West, 2000) and applications to social media (for example Hansen et al., 2011).
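As a small sketch, a graph can be stored as an adjacency list, with each node mapped to the set of its neighbours; the node names below are invented purely for illustration:

```python
# A tiny undirected graph as an adjacency list: each node maps to
# the set of nodes it is connected to, so every edge appears in
# both directions.  Node names are made up for the example.
graph = {
    "Alice": {"Bob", "Carol"},
    "Bob":   {"Alice"},
    "Carol": {"Alice", "Dave"},
    "Dave":  {"Carol"},
}

# The degree of a node is the number of edges incident to it.
degree = {node: len(neighbours) for node, neighbours in graph.items()}
print(degree["Alice"])   # 2

# For a directed graph, one would instead store only outgoing
# edges, e.g. digraph = {"Alice": {"Bob"}, "Bob": set()}.
```

Libraries such as NetworkX provide richer graph data structures along the same lines, but the dictionary above already captures the node-and-edge idea from the definition.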
References:
Hanneman, R. A., & Riddle, M. (2005). Introduction to Social Network Methods. from http://faculty.ucr.edu/~hanneman/nettext/Introduction_to_Social_Network_Methods.pdf
Hansen, D. L., Shneiderman, B., & Smith, M. A. (2011). Analyzing Social Media Networks with NodeXL: Insights from a Connected World. Amsterdam: Elsevier.
West, D.B. (2000). Introduction to Graph Theory. Pearson (2nd edition).
 Hierarchic Governance
Hierarchic governance structures constitute the traditional mode of governance in a country: they are based on a hierarchy of authority or power and are characterized by centralized state control exercised through a central government (Atkinson and Coleman, 1989). Governments are shifting, either voluntarily or through coercion, from hierarchic to networked modes of governance (Dijk & Winters-van Beek, 2009; Kettl, 2002; Hamza, 2013). But dismissing formal hierarchies as systems of governance is too risky, for several reasons (Atkinson and Coleman, 1989). Hierarchic modes of governance are still consistently applied in our political systems, now in combination with other modes of governance (Kettl & Jones, 1995). It is therefore important to understand the different structures in play (networks and hierarchies) and how they are applied (Thompson, 2003).
References:
Atkinson, M. & Coleman, W., (1989), Strong States and Weak States: Sectoral Policy Networks in Advanced Capitalist Economies, British Journal of Political Science, 19(1), pp. 47-67.
Dijk, J. & Winters-van Beek, A., (2009), The Perspective of Network Government: The Struggle Between Hierarchies, Markets and Networks as Modes of Governance in Contemporary Government, Innovation and the Public Sector, 14, pp.235-55.
Hamza, K., (2013), The Impact of Social Media and Network Governance on State Stability in Time of Turbulences: Egypt After 2011 Revolution, PhD Thesis. Brussels, Vrije Universiteit Brussel Institute for European Studies.
Kettl, D., (2002), The Transformation of Governance. Baltimore, MD: Johns Hopkins University Press.
Kettl, D. & Jones, B., (1995), Sharing Power: Public Governance and Private Markets. Journal of Politics, 57(1), pp.246-48.
Thompson, G., (2003), Between hierarchies and markets. The logic and limits of network forms of organization. New York: Oxford University Press.
 Hypothesis
The word hypothesis comes from the Ancient Greek ὑπόθεσις (hupothesis), meaning "to put under" or "to suppose". A hypothesis "is a proposition, or set of propositions, set forth as an explanation for the occurrence of some specified group of phenomena, either asserted merely as a provisional conjecture to guide investigation or accepted as highly probable in the light of established facts" (Webster dictionary).
Scientists generally base scientific hypotheses on previous observations that cannot satisfactorily be explained with the available scientific theories. A scientific hypothesis is a proposed explanation of a phenomenon which still has to be rigorously tested (Wise Geek, 2012).
In common usage, in the 21st century, a hypothesis refers to a provisional idea whose merit requires evaluation. For proper evaluation, the framer of a hypothesis needs to define specifics in operational terms. A hypothesis requires work by the researcher in order to either confirm or disprove it. In due course, a confirmed hypothesis may become part of a theory or occasionally may grow to become a theory itself.
In policy-making theories, the rational expectations hypothesis has been used to support some strong conclusions about economic policy making, notably the Policy Ineffectiveness Proposition developed by Thomas Sargent (Sargent, 1987).
References:
Webster’s Encyclopaedic Unabridged Dictionary of the English Language, Gramercy books (1989)
Wise Geek (2012). What is the Difference between a Theory and a Hypothesis?. Wise Geek. Retrieved 17 December 2012.
Sargent, T. J. (1987). Rational expectations, The New Palgrave: A Dictionary of Economics, v. 4, pp. 76–79.
 Innovation network
Innovation, the creation of new, technologically feasible, commercially realisable products, processes and organisational structures (Schumpeter, 1912; Fagerberg, Mowery and Nelson, 2006), is the result of the continuous interactions of innovative organisations such as universities, research institutes, firms such as multi-national corporations and small-to-medium-sized enterprises, government agencies, venture capitalists and others. These organisations exchange and generate knowledge by drawing on networks of relationships (innovation networks) that are embedded in institutional frameworks on the local, regional, national and international level (Ahrweiler 2010). For innovations to emerge, agents require not only financial resources to be invested in R&D, but the ability to recombine their own with external knowledge, to design interfaces to related knowledge fields and to meet customer needs.
Because agents engaged in innovation processes are confronted with a high degree of complexity, which is related to their competitors’ behaviours, the overall knowledge development, and dynamic changes in their customer needs, it is very unlikely that single firms will master all relevant knowledge fields in isolation. Innovation networks are considered to be an organizational form of R&D, which allows for mutual knowledge exchange and cross-fertilization effects among the heterogeneous actors involved.
References:
Ahrweiler, P. (Ed.)(2010), Innovation in complex social systems, London: Routledge.
Fagerberg, J., Mowery, D. and Nelson, R.R. (2006), The Oxford Handbook of Innovation, Oxford: Oxford University Press.
Schumpeter, J. (1912) The Theory of Economic Development, Oxford: Oxford University Press.
 Linear program
According to Luptáčik (2010), the linear program is the simplest and most widespread model of convex programming. A linear program, or linear programming problem, is an optimisation problem in which we attempt to maximise or minimise a linear function of the decision variables (the so-called objective function), where the values of the decision variables must satisfy a set of constraints, each of which must be a linear inequality or linear equality.
A linear program is a disarmingly simple object. According to Denardo (2011), its definition rests on the terms "linear expression" and "linear constraint". For instance, 3x - 2.5y + 2z is a linear expression whose variables are x, y and z, and the expression depends linearly on each of them. A linear constraint requires a linear expression to be less than or equal to, equal to, or greater than or equal to a number. A linear program either maximises or minimises a linear expression subject to finitely many linear constraints.
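A sketch of these definitions in code: the toy problem below (invented for illustration) maximises a linear expression subject to a few linear constraints. It exploits the fact that an optimum of a linear program, when one exists, lies at a vertex of the feasible region; the vertices are listed by hand here, whereas real solvers such as the simplex method find them systematically.

```python
# Toy linear program, solved by checking the vertices of its
# feasible region (illustrative only, not a production solver):
#
# maximise   3x + 2y
# subject to x + y <= 4
#            x     <= 2
#            x, y  >= 0

def feasible(x, y):
    """Check all linear constraints of the toy problem."""
    return x + y <= 4 and x <= 2 and x >= 0 and y >= 0

# The vertices of this small feasible polygon, found by inspection.
vertices = [(0, 0), (2, 0), (2, 2), (0, 4)]

# An optimum of an LP lies at a vertex, so take the best feasible one.
best = max((v for v in vertices if feasible(*v)),
           key=lambda v: 3 * v[0] + 2 * v[1])
print(best, 3 * best[0] + 2 * best[1])   # (2, 2) 10
```

For anything beyond toy problems, a library solver (for example scipy.optimize.linprog) would be used instead of vertex enumeration.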
References:
Denardo, E. V. Linear Programming and Generalizations. A Problem-based Introduction with Spreadsheets. 1st Edition. New York: Springer Science+Business Media, 2011. ISBN 978-1-4419-6490-8.
Luptáčik, M. Mathematical Optimization and Economic Analysis. 1st Edition. New York: Springer Science+Business Media, 2010. ISBN 978-0-387-89552-9.
 Linear Programming
Linear programming describes the family of mathematical tools used to analyse linear programs (see the definition of linear program in this glossary). The word "linear" reflects the character of the objective function and the constraints, and the word "programming" stems from applications in planning and action scheduling.
Linear programming was first designed as a planning and decision tool for settings in which a central decision maker, fully in control of the various quantity variables in the system, has to make consistent or optimal decisions. It was developed by Kantorovich (1939) and Dantzig (1982) as a tool for optimal central decision making, primarily for military purposes.
The standard linear programming formulation is clearly best suited to problems where a single decision maker optimises a central welfare function subject to technological and physical constraints. Unfortunately, the standard formulation is not so well suited to modelling situations where many agents independently maximise their own welfare functions and jointly, but inadvertently, determine an outcome that the planner or policy maker can affect only indirectly.
References:
Dantzig, G. B. (1982). Reminiscences about the origins of linear programming. In: Mathematical Programming: The State of the Art (Bonn, 1982). New York, 1983, pp. 78-86.
Kantorovich, L. V. (1939). Mathematical Methods of Organizing and Planning Production. Translated in Management Science, Vol. 6, No. 4 (Jul., 1960), pp. 366-422.
 Macro-Simulation
Macro-simulation refers to a simulation showing the effects of interventions at the macro level. It was introduced in the 1960s and early 1970s for economic analysis. Macro-simulations are quantitative models often based on control and feedback loops. They typically rest on a large number of assumptions and remain at the macro level, although nowadays macro-level predictions are also made by modelling at the micro level (see Macroeconomic Models). These assumptions have often been criticised for proving wrong and being backed by limited empirical evidence (e.g. Nelson & Winter, 1974). This resulted in scepticism and a shift towards microsimulation and agent-based models. In policy making, the interaction between the micro and macro levels has become central: by modelling and understanding the micro level, non-linear behaviour at the macro level can be observed (Gilbert & Troitzsch, 2005).
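One classic macro-level model built from a feedback loop between aggregate income, consumption and investment is Samuelson's multiplier-accelerator model (our choice of example, not discussed in the sources above). The sketch below simulates it as a pair of difference equations with illustrative, uncalibrated parameter values:

```python
# Minimal sketch of a macro-simulation: Samuelson's
# multiplier-accelerator model.  Aggregate income Y_t is driven by
# consumption C_t = c * Y_{t-1}, accelerator-driven investment
# I_t = v * (C_t - C_{t-1}), and fixed government spending g.
# Parameter values are illustrative, not calibrated.

def simulate(c=0.8, v=1.2, g=10.0, periods=20):
    """c: marginal propensity to consume, v: accelerator coefficient."""
    steady = g / (1 - c)          # steady-state income
    y = [steady, steady]
    y[1] += 1.0                   # one-off demand shock in period 1
    for t in range(2, periods):
        consumption = c * y[t - 1]
        investment = v * c * (y[t - 1] - y[t - 2])
        y.append(consumption + investment + g)
    return y

series = simulate()
print(series[:5])   # income oscillates around the steady state of 50
```

Depending on c and v, the same feedback loop yields damped, explosive or cyclical paths, which is exactly the kind of macro-level behaviour such simulations are used to explore.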
Related terms: Macroeconomic Models
References:
Gilbert, N. & Troitzsch, K. (2005). Simulation and social science. In: Simulation for the Social Scientist (2nd ed.). Open University Press.
Nelson, R.R. and Winter, S.G. (1974). Neoclassical vs. Evolutionary Theories of Economic Growth: Critique and Prospectus. The Economic Journal, Vol. 84, No. 336, pp. 886-905.
 Macroeconomic Models
Macroeconomic models describe the operation of the economy of a country or a region through a model whose constructs and outcomes are situated at the macro level (Yurdusev, 1993). "Macro" refers to the level of analysis: a distinction is often made between the micro and the macro level, where the micro level refers to analysis at the level of individuals, whereas the macro level looks across a large population, such as a country. Whereas in the past these models remained at the macro level and aggregates were used to model the economy, nowadays macroeconomic models are also constructed by modelling the relationships and decisions of individual agents, whose interactions result in macro-economic effects (Gilbert & Terna, 2000).
References
Yurdusev, A.N. (1993). Level of Analysis and Unit of Analysis: A Case for Distinction. Millennium: Journal of International Studies, 22(1): 77—88.
Gilbert, N. & Terna, P. (2000). How to build and use agent-based models in social science. Mind & Society, Vol 1, No. 1, pp 57-72.
 Mathematical Model
A mathematical model is an abstract model that uses mathematical language to describe the behaviour of a system. Eykhoff (1974) defines a mathematical model as 'a representation of the essential aspects of an existing system (or a system to be constructed) which presents knowledge of that system in usable form'.
Relationships and variables are the main elements of mathematical models. Operators, such as algebraic operators and functions, describe relationships, while variables abstract quantifiable system parameters of interest. A mathematical model can even consist of operators without variables (King, 1999).
Mathematical models can take the form of dynamical systems, statistical models, differential equations, or game-theoretic models. Davis (1996) provides a survey of mathematical models and formal approaches for decision and policy making. In his work, Davis shows how mathematics is a foundation of modern decision and policy making, as it allows the modelling of complex systems, testing and evaluation, and control and optimisation (Davis, 1996).
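A minimal concrete example of such a model (our own illustration, not drawn from the sources cited) is the logistic difference equation, which relates a single variable to its next value through algebraic operators:

```python
# A small mathematical model in code: the logistic difference
# equation x_{t+1} = r * x_t * (1 - x_t).  The variable x abstracts
# some quantity of interest (for this illustration, say the adoption
# share of a policy), and r is an illustrative growth parameter.

def logistic(r=2.5, x0=0.1, steps=30):
    """Iterate the logistic map from x0 for the given number of steps."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

trajectory = logistic()
print(trajectory[-1])   # converges toward the fixed point 1 - 1/r = 0.6
```

Even this one-line relationship shows the ingredients named above: a variable (x), operators (multiplication, subtraction), and a relationship linking successive states; for other values of r the same model produces oscillations or chaos.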
Toshkov proposes a mathematical model of the timing and policy shift of implementation outcomes. Using a 'decision making under institutional constraints' framework, the model yields a number of hypotheses about the impact of preferences, discretion, administrative capacity and policy-making constraints (Toshkov, 2007).
Related terms: Mathematical modelling
References:
Eykhoff,P. (1974) System Identification-Parameter and State Estimation. Wiley, New York.
King, P. J. B. (1999). Functions with no parameters.
Aris, R. ( 1994 ). Mathematical Modelling Techniques, New York : Dover. ISBN 0-486-68131-9.
Davis, P. (1996). Mathematics and Decision Making. Mathematics Awareness Week 1996.
Toshkov, D. (2007).  A Formal Model of Policy Implementation in Multi-level Systems of Governance. PhD Thesis, 2007.
 Mathematical Modelling
Modelling is a cognitive activity in which we think about and build models to describe how devices or objects of interest behave. We usually use words, drawings or sketches, physical models, computer programs, or mathematical formulas (more generally, different languages). When we use the language of mathematics to build models, we are doing mathematical modelling.
According to Dym and Ivey (2004), mathematical modelling is a principled activity: it has principles behind it and methods that can be successfully applied. As they note, the principles are over-arching or meta-principles, phrased as questions about the intentions and purposes of mathematical modelling.
Mathematical modelling is usually the use of mathematics to describe real or conceptual phenomena, to investigate important questions about the observed world, to explain phenomena, to test ideas, and to make predictions about the world around or inside us. Mathematical (or equation-based) modelling describes a system with the tools of calculus, typically in terms of systems of differential or difference equations. As mentioned in Troitzsch (1998), such equations usually allow only for a description at one macro level. The master equation approach can be used to describe interactions between a micro and a macro level, converting assumptions about the stochastic behaviour of micro units into statements about distributions of attributes of the macro unit or units.
According to Troitzsch (2009), only a few of these mathematical models have closed solutions, thus necessitating numerical treatment, which is a kind of simulation; more complex systems of more complex elements therefore profit greatly from agent-based models, whose structural validity is often better than that of mathematical models of social and economic systems. Whereas in physics mathematical models are often sufficient, and sometimes the best way of describing the interaction between fields and particles, this is only very rarely the case for social systems.
Related terms: Mathematical model
References:
Dym, C. L., Ivey, E. S. (2004). Principles of Mathematical Modeling (Computer Science and Applied Mathematics). Academic Press.
Troitzsch, Klaus G. (2009), Perspectives and Challenges of Agent-Based Simulation as a Tool for Economics and Other Social Sciences, In: Proc. of the 8th Int. Conf. on Autonomous Agents and Multi-Agent Systems (AAMAS 2009). p. 35-42.
Troitzsch, Klaus G. (1998), Multilevel Process Modeling in the Social Sciences: Mathematical Analysis and Computer Simulation, In: Liebrand, Wim B.G.; Nowak, Andrzej; Hegselmann, Rainer: Computer Modeling of Social Processes, London: Sage. p. 20--36.
 Mathematical programming
Mathematical programming (MP) is the use of mathematical (optimising) models to assist in making decisions. The term "programming" antedates computers and means preparing a schedule of activities. It is also one of the techniques of operational research.
A static mathematical program attempts to identify the maxima or minima of a function f(x_1,...,x_n) of n real-valued variables, called the objective function, over a domain defined by a set of constraints. The constraints typically take the general form of inequalities, for instance g_i(x_1,...,x_n) >= b_i. Mathematical programming finds the best solution to the problem as modelled. If the model has been built well, this solution should translate back into the real world as a good solution to the real-world problem; if it does not, analysing why it falls short leads to greater understanding of the real-world problem.
As described by Exodus Systems (2013), a mathematical programming model answers the question "What is best?" rather than "What happened?", "What if?" (simulation), "What will happen?" (forecasting) or "What would an expert do and why?" (expert system).
According to Troitzsch (1998), mathematical programming can be divided into linear programming, quadratic or non-linear programming, and stochastic programming.
References:
Exodus Systems (2013). Why Mathematical Programming is Useful [Online] (verified on November 23, 2013).
Troitzsch, Klaus G. (1998), Multilevel Process Modeling in the Social Sciences: Mathematical Analysis and Computer Simulation, In: Liebrand, Wim B.G.; Nowak, Andrzej; Hegselmann, Rainer: Computer Modeling of Social Processes. London: Sage. p. 20--36.
 Method
According to the Oxford dictionary, a method is "a particular procedure for accomplishing or approaching something, especially a systematic or established one", for example, a method for software development or maintenance. The word comes from Greek (methodos, μέθοδος), meaning pursuit of knowledge, via Latin; it derives from meta- (expressing development) and hodos (meaning way).
Method in terms of scientific research follows different approaches, with the main dichotomy between qualitative and quantitative research methods. At its most basic level, the scientific method consists of three main steps: Observing, Explaining, and Testing (Carey, 2011).
The data collection methods involved in the observation step include surveys, opinion polls and experiments, and the resulting data are analysed using statistical methods. These methods are suited to the study of scientific phenomena, but they are not always appropriate when studying social or cultural phenomena such as those related to the social and political aspects of e-participation and people's attitudes towards technologically driven social interaction through digital media. The use of qualitative methods (Silverman, 2013) is more appropriate when we want to:

- understand people's views, opinions and emotions from their own rather than the researcher's perspective
- understand processes in people's lives
- understand social interactions among people
- identify the social, political and cultural context in which people operate
The data collection methods typically used in qualitative research are interviews, observations, focus group discussions. These data are analysed using interpretive methods.
The words method and methodology (see the relevant entry in this glossary) sound similar, but there is a fundamental difference between them: a method is a series of steps to achieve something; methodology is the study of the design of the different methods used towards a goal (also referred to as research design).
Related term: Methodology
References:
Carey, S. S. (2011). A Beginner's Guide to Scientific Method. Wadsworth Publishing, Boston, MA.
Silverman, D. (2013). Doing Qualitative Research. Sage, London, UK.
 Methodology
A research methodology is the way a researcher systematically solves a research problem. It consists of the combination of the process, methods, and tools used in conducting research in a certain research domain, while research methods are means of finding truth in research domains (Nunamaker & Chen, 1990).
Bailey describes a methodology as the philosophy of the research process, which "includes the assumptions and values that serve as a rationale for research and the standards or criteria the researcher uses for interpreting data and reaching conclusions" (Bailey, 1982, p. 261). Hence, it is important for researchers to design the methodology according to the problem they are currently working on.
Related term: Method
References:
Bailey, K. D. (1982), Methods of Social Research, The Free Press
Nunamaker, J. F., Jr. and Chen, M. (1990), Systems development in information systems research. In: Proceedings of the Twenty-Third Annual Hawaii International Conference on System Sciences, Vol. III.