Foresight methods are usually classified using two criteria:
In terms of methods of analysis, foresight techniques can be broken down into quantitative, qualitative, and mixed ones.
Quantitative methods (big data mining, benchmarking, bibliometrics, patent analysis, modelling) are based on analysing simple phenomena with the help of mathematical models.
Qualitative methods (brainstorming, expert panels, genius forecast, in-depth interviews, goal trees, scenarios, science fiction, weak signals, and wild cards) can be used to analyse complex phenomena by formalising subjective expert knowledge.
Methods combining both these approaches are also applied in foresight studies, such as Delphi, critical technologies, surveys, technology roadmaps, STEEPV analysis, and stakeholder analysis.
Another criterion for structuring foresight techniques is the nature of data sources.
Heuristic methods rely on participants’ creative potential, expertise, and ability to generate new knowledge by interacting with each other.
Analytical methods are based on documented data, evidence, and statistics.
Any foresight project is preceded by a decision on which methods to use and which data sources to employ, while options for the future are assessed based on expert evaluation.
Big data mining
The exponential growth of the amount of information generated globally encourages researchers to ‘delegate’ some of the routine mental work to automated data collection and processing systems. Natural language processing (NLP) technologies are generally seen as the most suitable for these purposes. Semantic analysis is applied to automatically search for relevant data. The purpose of semantic analysis is to extract semantic entities of various levels of complexity from analysed texts in accordance with the specified parameters.
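As an illustration of the underlying idea (not of iFORA's actual pipeline), the following minimal Python sketch ranks candidate terms in a small invented corpus by TF-IDF weight, a simple proxy for extracting salient semantic entities from text:

```python
# A minimal sketch of automated term extraction, a building block of
# semantic text analysis. Real systems use far richer NLP pipelines;
# here TF-IDF simply ranks candidate terms by distinctiveness.
from sklearn.feature_extraction.text import TfidfVectorizer

documents = [  # invented sample corpus
    "quantum computing attracts growing venture investment",
    "venture funds shift investment toward quantum sensors",
    "gene editing trials expand in agricultural biotechnology",
]

# Extract single words and two-word phrases, ignoring English stop words.
vectorizer = TfidfVectorizer(ngram_range=(1, 2), stop_words="english")
tfidf = vectorizer.fit_transform(documents)

# Rank terms by their total TF-IDF weight across the corpus.
scores = tfidf.sum(axis=0).A1
terms = vectorizer.get_feature_names_out()
for term, score in sorted(zip(terms, scores), key=lambda p: -p[1])[:5]:
    print(f"{term}: {score:.3f}")
```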
The HSE Institute for Statistical Studies and Economics of Knowledge has developed the iFORA intelligent text mining system based on semantic analysis methods and algorithms.
iFORA can identify and rank complex semantic concepts such as trends, technologies, and research topics. Different iFORA modules do the primary work of isolating risk factors and weak signals from unstructured textual sources, and carry out trend clustering and benchmarking.
The iFORA database comprises 500 million documents in Russian, English, and Chinese, including stakeholder market analytics, articles in industry media, academic publications, and patent data.
The main advantages of automated big data mining are a wide coverage of data sources and reduced dependence on the professional qualifications of experts in manual parameter selection and feature engineering.
A disadvantage is that data mining algorithms do not use creative approaches; in addition, modelling errors can significantly distort the results.
The benchmarking method was borrowed by foresight teams from business, where it was applied as a tool for finding and making use of competitors’ successful practices and identifying one’s own weaknesses.
Benchmarking allows one to compare the national, regional, or industrial technology development level with that of the world’s leaders, assess the gap, identify the sectors with the highest innovation potential, and develop a strategy to accelerate technological development.
This method is actively applied by almost all institutions in developing economies, including government authorities. The difference between benchmarking and SWOT analysis-based approaches is in the tools used to identify the challenges posed by potential competitors.
In foresight studies, benchmarking is used to assess the effectiveness of technology borrowing depending on the local conditions.
The term ‘bibliometrics’ was first used in the bibliographer Paul Otlet’s work Treatise on Documentation (1934). The author suggested using mathematical and statistical approaches in the field of scientific communication. Otlet also brilliantly predicted the future role of computers in the progress of human civilisation.
The bibliometric method is based on quantitative and statistical analysis of publications indexed in international databases; researchers usually turn to Web of Science or Scopus.
Publication quality is assessed based on the Science Citation Index or by text mining. Machine analysis uses algorithms to determine the frequency of (co)occurrence of a given set of words in a large text database.
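The (co-)occurrence counting that such machine analysis relies on can be sketched in a few lines; the publication titles and term set below are invented for illustration:

```python
# A minimal sketch of co-occurrence counting over a text collection:
# for a given set of terms, count how often each pair appears in the
# same document. Titles are invented placeholders.
from itertools import combinations
from collections import Counter

titles = [
    "machine learning for drug discovery",
    "drug discovery pipelines and machine learning",
    "graphene electrodes for flexible sensors",
]
terms = {"machine learning", "drug discovery", "graphene", "sensors"}

cooccurrence = Counter()
for title in titles:
    present = {t for t in terms if t in title}
    for pair in combinations(sorted(present), 2):
        cooccurrence[pair] += 1

print(cooccurrence.most_common())
# [(('drug discovery', 'machine learning'), 2), (('graphene', 'sensors'), 1)]
```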
In-depth analysis of social networks’ bibliometric data is used in foresight studies to detect social development trends.
The use of mathematical and statistical bibliometric methods makes it possible to identify promising research areas and assess their dynamics and publication activity.
The patent analysis method involves processing an array of patent applications to identify new development areas for specific technologies and entire technological domains alike.
Two criteria are typically considered for patent indicators: inventions’ success, and market players’ economic interest in new inventions.
Patent analysis allows one to track the progress of specific research or development activities, as well as identify innovations and technological changes.
Patent analysis results present an image of the technological landscape that is several years old, but they are still useful for various forecasting purposes.
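A minimal sketch of the counting that underlies patent-trend analysis follows, using invented filing records rather than a real patent database:

```python
# A minimal sketch of patent-trend analysis: count applications per
# technology class per filing year and report year-on-year growth.
# The records are invented; real studies draw on patent databases.
from collections import defaultdict

applications = [  # (filing_year, technology_class)
    (2021, "batteries"), (2021, "batteries"), (2021, "photonics"),
    (2022, "batteries"), (2022, "batteries"), (2022, "batteries"),
    (2022, "photonics"),
]

counts = defaultdict(lambda: defaultdict(int))
for year, tech in applications:
    counts[tech][year] += 1

for tech, by_year in counts.items():
    years = sorted(by_year)
    for prev, curr in zip(years, years[1:]):
        growth = (by_year[curr] - by_year[prev]) / by_year[prev]
        print(f"{tech}: {prev}->{curr} filings grew {growth:+.0%}")
```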
The modelling method involves building and applying mathematical models which imitate actual processes or system behaviours.
Modelling helps to understand the relationship between system components, design operational or resource strategies to improve system performance, test new concepts and systems before their practical application, and obtain necessary information without interfering with real systems.
This method’s advantages include the ability to take into account complex relationships when pursuing various objectives, and quickly estimate potential prospects when system parameters change.
The perceived disadvantages include a strong dependence on data quality and completeness, difficulties presenting results, and significant distortion of results in the case of modelling errors.
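For illustration, the following sketch uses a simple logistic (S-curve) adoption model as a stand-in for the real processes such models imitate; all parameter values are assumptions, and re-running with changed parameters shows how quickly estimated prospects shift:

```python
# A minimal sketch of the modelling approach: a logistic (S-curve)
# model of technology adoption. All parameter values are assumptions.
def simulate_adoption(growth_rate, capacity, adopters0, years):
    """Discrete logistic growth: adoption accelerates, then saturates."""
    adopters = adopters0
    trajectory = [adopters]
    for _ in range(years):
        adopters += growth_rate * adopters * (1 - adopters / capacity)
        trajectory.append(adopters)
    return trajectory

# Changing one parameter quickly re-estimates prospects, e.g. a faster
# growth rate pulls saturation several years closer.
baseline = simulate_adoption(growth_rate=0.6, capacity=1_000_000,
                             adopters0=1_000, years=20)
fast = simulate_adoption(growth_rate=0.9, capacity=1_000_000,
                         adopters0=1_000, years=20)
print(f"year 10 adopters: baseline {baseline[10]:,.0f}, fast {fast[10]:,.0f}")
```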
The cross-impact analysis method is applied to assess the depth and scale of the impact of events and trends which determine the future.
A matrix of factors important for the future is created to conduct cross-impact analysis.
First, experts score the nature and strength of the factors’ mutual impact. They then analyse the interdependence of all matrix elements based on these scores, to determine and predict the direction and nature of sustainable trends.
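A minimal sketch of such a matrix follows, with invented factors and scores; row sums indicate how strongly a factor drives the system, and column sums how strongly it is driven:

```python
# A minimal sketch of a cross-impact matrix: experts score how strongly
# each factor influences every other (here on a -2..+2 scale). Factor
# names and scores are invented placeholders.
factors = ["ageing population", "automation", "remote work"]
impact = [  # impact[i][j]: effect of factor i on factor j
    [0, 1, 0],
    [1, 0, 2],
    [0, 1, 0],
]

for i, name in enumerate(factors):
    driving = sum(impact[i])                # how much it drives others
    driven = sum(row[i] for row in impact)  # how much it is driven
    print(f"{name}: driving={driving}, driven={driven}")
```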
Cross-impact analysis is effective for building scenarios when all future-relevant variables must be taken into account, and the accuracy of the forecast needs to be increased.
This method can also be applied to analyse risks and the factors most important for achieving the goal. In business, this approach helps companies rank future threats and opportunities.
If applied jointly with the Delphi method, cross-impact analysis increases the validity and accuracy of forecasts.
Combined with other techniques, it is used as a tool to measure scenarios’ sensitivity to changes in trends or to unpredictable events, which is important in planning foresight studies.
Backcasting is a future-prediction technique based on building normative scenarios and assessing their feasibility and consequences. The essence of this method is to present a future event as if it had already happened, and then plan actions to connect this future with the present. Backcasting involves mapping a step-by-step route from the destination point in the future back to the present, and identifying the steps which would lead to the desired result.
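The backward mapping can be made concrete with a small sketch; the milestones below are invented placeholders:

```python
# A minimal sketch of backcasting: start from a desired end state and
# walk backwards, asking at each step what must already be true one
# stage earlier. Milestones here are invented placeholders.
def backcast(target, preconditions):
    """Return the plan in forward order, derived backwards from target."""
    plan = [target]
    step = target
    while step in preconditions:  # walk back towards the present
        step = preconditions[step]
        plan.append(step)
    return list(reversed(plan))

preconditions = {  # each milestone mapped to what must precede it
    "zero-emission city transport by 2040": "full electric bus fleet by 2035",
    "full electric bus fleet by 2035": "charging network built by 2030",
    "charging network built by 2030": "pilot routes electrified by 2026",
}
for milestone in backcast("zero-emission city transport by 2040", preconditions):
    print(milestone)
```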
Backcasting is particularly relevant when the issue under consideration is complex and affects multiple sectors and social strata over a sufficiently long period. Usually these are challenges associated with social development, environment and technology. The method somewhat overlaps with scenario analysis, though not all scenario studies can be classified as backcasting ones.
In business, backcasting is effectively applied to develop long-term strategies.
This technique involves suggesting several options for future developments, or ‘future scenarios’. In each scenario, the future must be convincingly described, possible development paths presented and substantiated, while the total number of scenarios must remain reasonable. The term ‘scenarios’ was introduced in relation to forecasting by Herman Kahn in the course of strategic military research conducted by the RAND Corporation, which also invented the Delphi method. The scenario technique was first applied for civilian purposes in 1971, when Pierre Wack, an employee of the Dutch company Shell, managed to convince the management that the oil market was heading for inevitable turmoil. As a result of its timely decisions, Shell turned out to be best prepared for the new reality after the 1973 oil shock. The company subsequently became one of the top ten oil and gas firms in the world, while others struggled to revise their budgets and investment plans.
The scenario method is effective for predicting technological changes and supplementing studies conducted using other techniques such as SWOT analysis, brainstorming, bibliometrics, and patent analysis.
Expert interviews are the basic foresight research tool designed to provide immersion in the research topic through verbal consultations with qualified experts.
An expert interview is a structured conversation on a clearly defined topic, with industry experts, ‘opinion leaders’, or specialists acting as respondents.
The purpose of an interview is to formalise the empirical knowledge of an expert in the relevant field and obtain tacit expert knowledge. As a rule, interviews should be conducted when an expert opinion on the research object’s current and target states needs to be obtained quickly and in a concentrated form.
Interviews should mainly focus on the following topics:
This method’s disadvantages include the risk of respondents lobbying their vested interests, and the need for additional validation of expert opinions.
Expert panels are periodic surveys of a group of experts on the same topic, while the group remains unchanged throughout the project.
Experts from different fields are invited to join a group for several months to monitor changes in the area under study and to interpret the results. The experts use analytical and informational materials produced by the foresight study organisers.
Expert panels are set up to obtain expert assessments of the object of study’s state and any changes on a regular basis.
The main advantage of the expert panel technique is that experts from different fields work together as a group and remain continuously involved in the foresight process.
Expert panels are effective when used in combination with other foresight methods. This technique is considered to be basic, and is applied in almost all foresight projects.
The term ‘weak signal’ was coined in 1975 by the mathematician Igor Ansoff to refer to a subtle sign of an emerging trend. Identifying weak signals helps to make better strategic decisions.
Because they appear insignificant, weak signals are easily missed among more prominent factors, so identifying and correctly interpreting them can determine the success or failure of strategic planning.
Approaches to identifying weak signals can be based on mathematical solutions to problems where small deviations of the initial parameters lead to significant changes in end results, as well as on expert assessments.
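One simple quantitative route, sketched below with invented mention counts and assumed thresholds, is to flag terms whose frequency is still small but growing steeply:

```python
# A minimal sketch of one quantitative route to weak signals: flag terms
# whose mention counts are still small but growing steeply, so they are
# easy to miss next to established topics. All counts are invented.
yearly_mentions = {
    "artificial intelligence": [900, 950, 1000, 1100],  # large, steady
    "biodegradable electronics": [3, 5, 9, 17],         # small, accelerating
}

for term, counts in yearly_mentions.items():
    growth = counts[-1] / counts[0]
    is_weak_signal = counts[-1] < 50 and growth > 3  # assumed thresholds
    print(f"{term}: x{growth:.1f} over period, weak signal: {is_weak_signal}")
```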
A disadvantage of the method is erroneous mechanistic interpretation of ‘retrospective narratives’.
The term ‘environment scanning’ was coined in 1967 by Professor Francis Aguilar in his work Scanning the Business Environment.
The purpose of the method is to generate (manually or with the help of machines) reference information from a variety of sources: newspapers, magazines, internet resources, television, conference proceedings, scientific, technological, and corporate reports, fiction, and science fiction.
The scanning process aims not only to collect information, but also to identify and structure ‘weak signals’ on a daily basis. Scanning imposes strict requirements on search tools, data sources, and participants’ qualifications.
Horizon scanning involves the collection, analysis, synthesis, and dissemination of information about the external environment. The method generally focuses on three areas:
Horizon scanning is an ongoing process, as it requires constant updating of information about current changes.
This method is designed to detect low-probability, unpredictable events with potentially explosive effects. Unlike ‘weak signals’, which are signs of emerging trends and do not always lead to revolutionary changes, wild cards are associated with rare events capable of suddenly transforming the established order of things.
Regular monitoring and searching for wild cards is usually carried out by professional observers who scan the media environment to track the ‘trails’ of wild cards in publications, scientific reports, and industry conference proceedings. Special horizon scanning project websites also provide a rich information base for such monitoring.
There are two types of wild cards: events which have already taken place, and possible events that may occur in the future (so-called ‘imaginary’ wild cards). When analysing the latter, it should be taken into account that they are in fact the mental products of people who can have both good and bad intentions. The risks of self-fulfilling prophecies should also be considered.
Spotting and explaining wild cards allows organisations to pursue more resilient policies and expand their development opportunities.
The brainstorming method involves a group search for non-standard solutions. In foresight studies, it’s applied to generate ‘wild’ ideas and incredible visions of the future.
This technique combines the principles of intuitive and logical thinking and is applied in an atmosphere of co-creation and trust.
In the initial stage of brainstorming, the free association method is used. Freeing the participants’ intuition and imagination produces the best results in a single session.
The second stage involves structuring and ranking the proposed ideas, identifying the most realistic ones, and rejecting weaker ones.
The advantages of the method are that it involves all group members in the process and makes it possible to obtain results quickly. The disadvantages include difficulties organising brainstorming sessions and the lack of any guarantee of useful results.
Building a goal tree is the best-known normative forecasting technique. The method was proposed by C. West Churchman and Russell Ackoff in 1957 to schematically describe the conditions for achieving goals. It involves a multi-stage breakdown of the problem sequentially considering all conditions and goals, even minor ones. The graphical representation of goals distributed between different levels resembles an inverted tree, hence the method’s name.
A graphical decision-making model includes goals, objectives, required steps, and links between them.
This method helps to determine which development factors may emerge in the future, which resources, actions, and knowledge may be required immediately, and which will be needed at later stages.
This approach has proven useful for building scenarios.
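A goal tree is naturally represented as a nested structure; the goals in this sketch are invented placeholders:

```python
# A minimal sketch of a goal tree: the root goal breaks down into
# sub-goals level by level until concrete steps are reached. The goals
# themselves are invented placeholders.
goal_tree = {
    "reduce urban air pollution": {
        "cut transport emissions": {
            "expand public transport": {},
            "introduce low-emission zones": {},
        },
        "cut heating emissions": {
            "retrofit building insulation": {},
        },
    },
}

def print_tree(node, depth=0):
    """Walk the tree top-down, indenting one level per breakdown stage."""
    for goal, subgoals in node.items():
        print("  " * depth + "- " + goal)
        print_tree(subgoals, depth + 1)

print_tree(goal_tree)
```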
SWOT analysis involves describing the challenges and threats that the corporation, region, or industry under consideration will face in the future, and identifying its strengths and weaknesses.
To identify and rank challenges and threats, experts must possess not only practical knowledge, but also intuition and imagination.
Strengths | Weaknesses
Opportunities | Threats
This method is often applied to develop corporate business strategies.
One of its advantages is that it requires only a small number of experts.
Situational analysis is conducted to study major international political events, and predict their consequences.
The purpose is to produce relevant forecasts based on the identified characteristics of the situation under study, consider their logical relationships and overall importance. This method can serve as a tool both for working with primary data, and producing arrays of secondary analytical and prognostic information.
Situational analysis is based on viewing global political situations in a systemic way taking into account historical factors, i.e. as integrated dynamic subsystems in the system of international relations.
As an applied analytical and forecasting method, situational analysis serves as an effective format of communication between leading experts and decision makers in the foreign policy sphere. In particular, situational analysis made it possible to “predict” the Iran-Iraq war (1980-1988) ten months before it actually started, and the bombing of Cambodia (1969-1973) three months in advance.
The main driving force of the method is the experts involved, since situational analysis implies a comprehensive consideration of the problem based on combining individual and collective assessments. The optimal size of the expert panel is 25-30 people. Involving both competent professionals from academia and practitioners is an essential aspect of setting up expert panels.
Other mandatory criteria for the expert group composition typically include an adequate organisational structure, complementarity, and mutual control of the members.
The classic situational analysis technique involves holding the main meeting over two to three consecutive days. In practice, however, a one-day meeting with breaks is most common, given the high workload and fatigue of the participants.
Competitive intelligence involves the ethical collection, processing, and analysis of data on the internal and external environment, to provide competitive advantages for the customer.
Initially, the competitive intelligence concept was associated with the military, but since the middle of the last century it has also been applied in politics, economics, and national security.
Competitive intelligence is a strategic tool to promote organisations’ innovation activity, since it makes it possible to identify and monitor industry trends, obtain up-to-date information about advanced technologies and about political, regulatory, or other changes which can affect the company or industry, and predict competitors’ behaviour.
Information sources for competitive intelligence include the internet, the company’s employees and clients, industry experts, and social networks.
The most common approach to competitive intelligence comprises four stages: planning, collection, analysis, and communication, which are constantly affected by two factors: organisational culture/awareness, and processes/structure.
Networking remains a key component of competitive intelligence: since professionals tend not to advertise their involvement in this activity, access to the best practices and methods applied is limited.
This method involves group expert discussions aimed at exploring and designing the future, taking into account the risks of negative trends. Global trends as such are not the subject of these discussions, but are used as basic information sources for designing local development models. Foresight workshops are an applied tool for building scenarios and technology roadmaps, and can be integrated with techniques such as brainstorming, stakeholder analysis, and trend analysis.
Foresight workshops’ results include expert presentations of realistic options for the future, identified attractive development prospects, and joint strategies to increase the likelihood of the desired future.
A disadvantage of the method is that the foresight workshop’s conclusions depend heavily on the moderator’s skills in managing group dynamics and conducting a constructive discussion.
The main advantage is believed to be the emergence of a network of experts sharing a common vision of the future, and a proactive position on how to achieve it.
A group discussion of strategy and scenarios during foresight workshops helps reduce participants’ indifference or fear of the future.
The Delphi method was developed by the RAND Corporation in the 1950s to predict how technology can affect warfare. The method became associated with foresight after the Japanese government applied it in 1971 in the state Technology Foresight programme. The country’s leading scientists, entrepreneurs, financiers, and politicians were asked to fill in a questionnaire. During the foresight process, participants were allowed to receive feedback from colleagues, read the ‘group response’, and discuss and revise their answers to see how their opinion differed from the overall one. The forecast was so successful that technology foresight studies based on the Delphi method are carried out in Japan to this day.
Delphi involves conducting an anonymous, remote, two-stage survey of a large number of experts. The process is aimed at a gradual convergence of opinions and the elimination of marginal judgments. In some cases, even changing the original wording of the problem is allowed.
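The mechanics of this convergence can be sketched with invented estimates: the spread of answers should narrow between rounds as experts see the group response:

```python
# A minimal sketch of Delphi-style convergence: experts estimate a date,
# see the group median between rounds, and revise their answers. Both
# rounds of estimates are invented.
from statistics import median

round1 = [2030, 2042, 2035, 2055, 2038]  # year a technology matures
round2 = [2034, 2040, 2036, 2044, 2038]  # revised after seeing feedback

for label, answers in (("round 1", round1), ("round 2", round2)):
    spread = max(answers) - min(answers)
    print(f"{label}: median {median(answers)}, spread {spread} years")
# The shrinking spread signals the gradual convergence of opinions.
```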
This technique is most often used to draft a list of science and technology achievements that would have the strongest impact on the future (with a planning horizon of up to 30 years) in areas such as basic and applied research, as well as innovative products and services based on new technologies.
This method involves identifying technologies that would make the biggest contribution to accomplishing socio-economic and innovation development objectives. This foresight technique is particularly popular in economically developed countries.
An initial list of critical technologies is drafted based on surveys and in-depth interviews with highly qualified experts.
In the next stage, a larger number of experts (up to 200) get involved in the foresight study; they discuss the advantages, disadvantages, and effects of introducing critical technologies in expert panels or focus groups (the horizon ranges from 5 to 10 years).
In the last stage, the final list of critical technologies is drafted and approved.
The technology roadmaps method was successfully applied in 1987 by Motorola. Since then, many corporations have used this technique to design long-term development strategies for the company, or for particular technologies.
Essentially, the method involves drawing up a step-by-step plan for achieving a desired goal. In addition to outlining the optimal route, the roadmap also shows alternative paths, other ways of achieving the goal, and timeframes for moving from one technological node to another.
A roadmap is a step-by-step plan for moving from the current state to the desired future in development phases. Progress towards the goal is synchronised with the development of technologies, products, services, business and markets.
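As a data structure, a roadmap can be sketched as time-ordered milestones on synchronised layers; all entries below are invented placeholders:

```python
# A minimal sketch of a technology roadmap as data: time-ordered nodes
# on several synchronised layers (technology, product, market). All
# entries are invented placeholders.
roadmap = [
    # (year, layer, milestone)
    (2026, "technology", "solid-state cell prototype"),
    (2028, "product",    "pilot battery pack"),
    (2028, "technology", "scaled cell manufacturing"),
    (2031, "market",     "entry into commercial vehicles"),
]

# Print the route in time order, showing how the layers advance together.
for year, layer, milestone in sorted(roadmap):
    print(f"{year}  {layer:>10}: {milestone}")
```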
The main advantage of this method is forging a vision of a long-term development strategy shared by all stakeholders.
Roadmaps are the basis of Technology Sequence Analysis (TSA), which uses statistical methods to estimate timeframes for achieving intermediate goals.
The STEEPV method involves matrix analysis of the external factors which affect the object of study. It was borrowed from strategic management and extended for the purposes of full-fledged foresight.
STEEPV analysis allows one to assess a strategic plan’s dependence on six external environmental factors: social (S), technological (T), economic (E), environmental (E), political (P), and value (V).
These factors are considered both individually and in combination, since a change in one area can lead to changes in others. For example, technological changes lead to economic ones, while political decisions affect economic development.
STEEPV analysis involves building a strategy that takes into account the study object’s vulnerability or resistance to the above external environment factors.
To produce a reasonable forecast, not only are the strength and impact of external factors considered, but also the timeframe of their consequences.
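A minimal sketch of such scoring follows, with invented expert estimates and an assumed weighting that favours nearer-term effects:

```python
# A minimal sketch of STEEPV scoring: each external factor gets an
# expert estimate of impact strength (1-5) and of how soon the effect
# is felt; combining the two ranks the factors. Scores are invented.
steepv = {
    # factor: (impact_strength 1-5, years_until_effect)
    "social":        (3, 10),
    "technological": (5, 4),
    "economic":      (4, 2),
    "environmental": (4, 8),
    "political":     (2, 1),
    "value":         (3, 12),
}

# Assumed weighting: nearer effects matter more for the same strength.
ranked = sorted(steepv.items(),
                key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
for factor, (strength, horizon) in ranked:
    print(f"{factor}: strength {strength}, felt in ~{horizon} years")
```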
Stakeholder analysis is applied to assess the involvement in the project of members of the upper levels of the management pyramid who have an interest in the process and its end result. The method makes it possible to prepare recommendations for project participants on how to manage stakeholder expectations, and to involve stakeholders in the foresight process so as to ensure the practical application of its results.
Support from influential stakeholders significantly increases the chances of the project being implemented, so an important element of stakeholder analysis is the correct assessment of their influence and roles. Influence is stakeholders’ ability to affect project participants and solve funding problems; their role is determined by their intellectual and administrative contribution to implementing the project.
Developing a stakeholder strategy implies creating mechanisms for involving each stakeholder in the project, and choosing an appropriate way to obtain their support.
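A minimal sketch of this mapping follows, with invented stakeholders and scores, classifying each by influence and contribution and suggesting an engagement tactic per quadrant:

```python
# A minimal sketch of stakeholder mapping: place each stakeholder by
# influence (ability to affect the project) and contribution (their
# intellectual or administrative role), then pick an engagement tactic
# per quadrant. Names, scores, and tactics are invented.
stakeholders = {
    # name: (influence 1-5, contribution 1-5)
    "ministry sponsor":   (5, 2),
    "lead research team": (3, 5),
    "industry partner":   (4, 4),
    "regional agency":    (2, 2),
}

def engagement(influence, contribution):
    if influence >= 4 and contribution >= 4:
        return "involve as co-owner of the foresight process"
    if influence >= 4:
        return "keep satisfied; secure support and funding"
    if contribution >= 4:
        return "keep closely informed; draw on expertise"
    return "monitor with minimal effort"

for name, (inf, contrib) in stakeholders.items():
    print(f"{name}: {engagement(inf, contrib)}")
```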