What is Research Methodology? Definition, Types, and Examples


Research methodology [1,2] is a structured and scientific approach used to collect, analyze, and interpret quantitative or qualitative data to answer research questions or test hypotheses. A research methodology is like a plan for carrying out research and helps keep researchers on track by limiting the scope of the research. Several aspects must be considered before selecting an appropriate research methodology, such as research limitations and ethical concerns that may affect your research.

The research methodology section in a scientific paper describes the different methodological choices made, such as the data collection and analysis methods, and why these choices were selected. The reasons should explain why the methods chosen are the most appropriate to answer the research question. A good research methodology also helps ensure the reliability and validity of the research findings. There are three types of research methodology—quantitative, qualitative, and mixed methods—which can be chosen based on the research objectives.

What is research methodology?

A research methodology describes the techniques and procedures used to identify and analyze information regarding a specific research topic. It is a process by which researchers design their study so that they can achieve their objectives using the selected research instruments. It includes all the important aspects of research, including research design, data collection methods, data analysis methods, and the overall framework within which the research is conducted. While these points can help you understand what is research methodology, you also need to know why it is important to pick the right methodology.

Why is research methodology important?

Having a good research methodology in place has the following advantages [3]:

  • It helps other researchers who may want to replicate your research; clear methodological explanations will be of benefit to them.
  • It allows you to easily answer any questions about your research that arise at a later stage.
  • A research methodology provides a framework and guidelines for researchers to clearly define research questions, hypotheses, and objectives.
  • It helps researchers identify the most appropriate research design, sampling technique, and data collection and analysis methods.
  • A sound research methodology helps researchers ensure that their findings are valid and reliable and free from biases and errors.
  • It also helps ensure that ethical guidelines are followed while conducting research.
  • A good research methodology helps researchers in planning their research efficiently, by ensuring optimum usage of their time and resources.


Types of research methodology

There are three types of research methodology based on the type of research and the data required [1].

  • Quantitative research methodology focuses on measuring and testing numerical data. This approach is good for reaching a large number of people in a short amount of time. This type of research helps in testing the causal relationships between variables, making predictions, and generalizing results to wider populations.
  • Qualitative research methodology examines the opinions, behaviors, and experiences of people; it collects and analyzes words and textual data. This research methodology requires fewer participants but is more time-consuming because the time spent per participant is substantial. This method is used in exploratory research where the research problem being investigated is not clearly defined.
  • Mixed-method research methodology uses the characteristics of both quantitative and qualitative research methodologies in the same study. This method allows researchers to validate their findings, verify if the results observed using both methods are complementary, and explain any unexpected results obtained from one method by using the other method.

What are the types of sampling designs in research methodology?

Sampling [4] is an important part of a research methodology and involves selecting a representative sample of the population to conduct the study, making statistical inferences about it, and estimating the characteristics of the whole population based on these inferences. There are two types of sampling designs in research methodology—probability and nonprobability.

  • Probability sampling

In this type of sampling design, a sample is chosen from a larger population using some form of random selection, that is, every member of the population has an equal chance of being selected. The different types of probability sampling are:

  • Systematic —sample members are chosen at regular intervals. The researcher selects a random starting point and a fixed interval at which subsequent members are picked. Because the range is predefined, this is the least time-consuming method.
  • Stratified —researchers divide the population into smaller, non-overlapping groups (strata) that together represent the entire population, and then draw a sample from each group separately.
  • Cluster —the population is divided into clusters based on parameters like age, sex, location, etc., and clusters are then selected for study.

  • Nonprobability sampling

In this type of sampling design, participants are selected based on nonrandom criteria, so not every member of the population has a chance of being included. The different types of nonprobability sampling are:

  • Convenience —selects participants who are most easily accessible to researchers due to geographical proximity, availability at a particular time, etc.
  • Purposive —participants are selected at the researcher’s discretion, considering the purpose of the study and the researcher’s understanding of the target audience.
  • Snowball —already selected participants use their social networks to refer the researcher to other potential participants.
  • Quota —while designing the study, researchers decide how many people with which characteristics to include as participants; these characteristics help in choosing people most likely to provide insights into the subject.
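To make the systematic sampling described above concrete, here is a minimal Python sketch; the population list, sample size, and the `systematic_sample` helper are invented for illustration:

```python
import random

def systematic_sample(population, sample_size):
    """Draw a systematic sample: pick a random starting point,
    then take members at a fixed interval until the sample is full."""
    interval = len(population) // sample_size      # sampling interval k
    start = random.randrange(interval)             # random start within the first interval
    return [population[start + i * interval] for i in range(sample_size)]

# Hypothetical population of 100 numbered participants
population = list(range(1, 101))
sample = systematic_sample(population, sample_size=10)
print(sample)  # e.g. [4, 14, 24, 34, 44, 54, 64, 74, 84, 94]
```

Because the interval is fixed once the random starting point is drawn, every member of the population has an equal chance of selection, which is what makes this a probability design.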

What are data collection methods?

During research, data are collected using various methods depending on the research methodology being followed and the research methods being undertaken. Both qualitative and quantitative research have different data collection methods, as listed below.

Qualitative research [5]

  • One-on-one interviews: Helps the interviewers understand a respondent’s subjective opinion and experience pertaining to a specific topic or event
  • Document study/literature review/record keeping: Researchers’ review of already existing written materials such as archives, annual reports, research articles, guidelines, policy documents, etc.
  • Focus groups: Constructive discussions that usually include a small sample of about 6-10 people and a moderator, to understand the participants’ opinion on a given topic.
  • Qualitative observation: Researchers collect data using their five senses (sight, smell, touch, taste, and hearing).

Quantitative research [6]

  • Sampling: The most common type is probability sampling.
  • Interviews: Commonly conducted over the telephone or in person.
  • Observations: Structured observations are most commonly used in quantitative research. In this method, researchers make observations about specific behaviors of individuals in a structured setting.
  • Document review: Reviewing existing research or documents to collect evidence for supporting the research.
  • Surveys and questionnaires: Surveys can be administered both online and offline depending on the requirement and sample size.


What are data analysis methods?

The data collected using the various methods for qualitative and quantitative research need to be analyzed to generate meaningful conclusions. These data analysis methods [7] also differ between quantitative and qualitative research.

Quantitative research involves a deductive method for data analysis where hypotheses are developed at the beginning of the research and precise measurement is required. The methods include statistical analysis applications to analyze numerical data and are grouped into two categories—descriptive and inferential.

Descriptive analysis is used to summarize the basic features of the data and present them in a way that makes patterns apparent. The different types of descriptive analysis methods are:

  • Measures of frequency (count, percent, frequency)
  • Measures of central tendency (mean, median, mode)
  • Measures of dispersion or variation (range, variance, standard deviation)
  • Measure of position (percentile ranks, quartile ranks)
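All of these descriptive measures can be computed with Python's standard `statistics` module; the data values below are made up for illustration:

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]  # hypothetical scores

# Measures of central tendency
mean = statistics.mean(data)            # 5.0
median = statistics.median(data)        # 4.5
mode = statistics.mode(data)            # 4

# Measures of dispersion (population variance and standard deviation)
variance = statistics.pvariance(data)   # 4.0
stdev = statistics.pstdev(data)         # 2.0

print(mean, median, mode, variance, stdev)
```

For frequency and position measures, `collections.Counter` and `statistics.quantiles` from the same standard library cover the remaining bullet points.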

Inferential analysis is used to make predictions about a larger population based on the analysis of the data collected from a smaller population. This analysis is used to study the relationships between different variables. Some commonly used inferential data analysis methods are:

  • Correlation: To understand the relationship between two or more variables.
  • Cross-tabulation: Analyze the relationship between multiple variables.
  • Regression analysis: Study the impact of independent variables on the dependent variable.
  • Frequency tables: To understand the frequency of data.
  • Analysis of variance (ANOVA): To test whether the means of two or more groups differ significantly in an experiment.

Qualitative research involves an inductive method for data analysis where hypotheses are developed after data collection. The methods include:

  • Content analysis: For analyzing documented information from text and images by determining the presence of certain words or concepts in texts.
  • Narrative analysis: For analyzing content obtained from sources such as interviews, field observations, and surveys. The stories and opinions shared by people are used to answer research questions.
  • Discourse analysis: For analyzing interactions with people considering the social context, that is, the lifestyle and environment, under which the interaction occurs.
  • Grounded theory: Involves hypothesis creation by data collection and analysis to explain why a phenomenon occurred.
  • Thematic analysis: To identify important themes or patterns in data and use these to address an issue.
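A minimal sketch of the word-counting step behind content analysis, using only Python's standard library; the transcript excerpt and concept list are invented for illustration:

```python
from collections import Counter
import re

# Hypothetical interview transcript excerpt
text = ("Participants said the new schedule reduced stress. "
        "Several participants felt the schedule improved focus, "
        "though stress returned during deadlines.")

# Count occurrences of concept words of interest (a toy coding scheme)
concepts = {"stress", "schedule", "focus"}
words = re.findall(r"[a-z]+", text.lower())
counts = Counter(w for w in words if w in concepts)

print(counts)  # Counter({'schedule': 2, 'stress': 2, 'focus': 1})
```

Real content analysis goes well beyond raw counts (coding schemes, inter-rater reliability), but frequency tallies like this are typically the first quantifiable signal extracted from qualitative text.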

How to choose a research methodology?

Here are some important factors to consider when choosing a research methodology [8]:

  • Research objectives, aims, and questions —these would help structure the research design.
  • Review existing literature to identify any gaps in knowledge.
  • Check the statistical requirements —if data-driven or statistical results are needed, quantitative research is the best fit. If the research questions can be answered based on people’s opinions and perceptions, then qualitative research is most suitable.
  • Sample size —sample size can often determine the feasibility of a research methodology. For a large sample, less effort- and time-intensive methods are appropriate.
  • Constraints —constraints of time, geography, and resources can help define the appropriate methodology.


How to write a research methodology?

A research methodology should include the following components [3,9]:

  • Research design —should be selected based on the research question and the data required. Common research designs include experimental, quasi-experimental, correlational, descriptive, and exploratory.
  • Research method —this can be quantitative, qualitative, or mixed-method.
  • Reason for selecting a specific methodology —explain why this methodology is the most suitable to answer your research problem.
  • Research instruments —explain the research instruments you plan to use, mainly referring to the data collection methods such as interviews, surveys, etc. Here as well, a reason should be mentioned for selecting the particular instrument.
  • Sampling —this involves selecting a representative subset of the population being studied.
  • Data collection —involves gathering data using several data collection methods, such as surveys, interviews, etc.
  • Data analysis —describe the data analysis methods you will use once you’ve collected the data.
  • Research limitations —mention any limitations you foresee while conducting your research.
  • Validity and reliability —validity helps identify the accuracy and truthfulness of the findings; reliability refers to the consistency and stability of the results over time and across different conditions.
  • Ethical considerations —research should be conducted ethically. The considerations include obtaining consent from participants, maintaining confidentiality, and addressing conflicts of interest.

Streamline Your Research Paper Writing Process with Paperpal

The methods section is a critical part of a research paper; other researchers use it to understand your findings and to replicate your work in their own research. However, it is usually also the most difficult section to write. This is where Paperpal can help you overcome writer’s block and create a first draft in minutes with Paperpal Copilot, its secure generative AI feature suite.

With Paperpal you can get research advice, write and refine your work, rephrase and verify the writing, and ensure submission readiness, all in one place. Here’s how you can use Paperpal to develop the first draft of your methods section.  

  • Generate an outline: Input some details about your research to instantly generate an outline for your methods section 
  • Develop the section: Use the outline and suggested sentence templates to expand your ideas and develop the first draft.  
  • Paraphrase and trim: Get clear, concise academic text with paraphrasing that conveys your work effectively and word reduction to fix redundancies.
  • Choose the right words: Enhance text by choosing contextual synonyms based on how the words have been used in previously published work.  
  • Check and verify text: Make sure the generated text showcases your methods correctly, has all the right citations, and is original and authentic.

You can repeat this process to develop each section of your research manuscript, including the title, abstract and keywords. Ready to write your research papers faster, better, and without the stress? Sign up for Paperpal and start writing today!

Frequently Asked Questions

Q1. What are the key components of research methodology?

A1. A good research methodology has the following key components:

  • Research design
  • Data collection procedures
  • Data analysis methods
  • Ethical considerations

Q2. Why is ethical consideration important in research methodology?

A2. Ethical considerations are important in research methodology to assure readers of the reliability and validity of the study. Researchers must clearly mention the ethical norms and standards followed during the conduct of the research and state whether the research has been cleared by an institutional board. The following are the important principles related to ethical considerations [10]:

  • Participants should not be subjected to harm.
  • Respect for the dignity of participants should be prioritized.
  • Full consent should be obtained from participants before the study.
  • Participants’ privacy should be ensured.
  • Confidentiality of the research data should be ensured.
  • Anonymity of individuals and organizations participating in the research should be maintained.
  • The aims and objectives of the research should not be exaggerated.
  • Affiliations, sources of funding, and any possible conflicts of interest should be declared.
  • Communication in relation to the research should be honest and transparent.
  • Misleading information and biased representation of primary data findings should be avoided.

Q3. What is the difference between methodology and method?

A3. Research methodology is different from a research method, although both terms are often confused. Research methods are the tools used to gather data, while the research methodology provides a framework for how research is planned, conducted, and analyzed. The latter guides researchers in making decisions about the most appropriate methods for their research. Research methods refer to the specific techniques, procedures, and tools used by researchers to collect, analyze, and interpret data, for instance surveys, questionnaires, interviews, etc.

Research methodology is, thus, an integral part of a research study. It helps ensure that you stay on track to meet your research objectives and answer your research questions using the most appropriate data collection and analysis tools based on your research design.


References

1. Research methodologies. Pfeiffer Library website. Accessed August 15, 2023. https://library.tiffin.edu/researchmethodologies/whatareresearchmethodologies
2. Types of research methodology. Eduvoice website. Accessed August 16, 2023. https://eduvoice.in/types-research-methodology/
3. The basics of research methodology: A key to quality research. Voxco. Accessed August 16, 2023. https://www.voxco.com/blog/what-is-research-methodology/
4. Sampling methods: Types with examples. QuestionPro website. Accessed August 16, 2023. https://www.questionpro.com/blog/types-of-sampling-for-social-research/
5. What is qualitative research? Methods, types, approaches, examples. Researcher.Life blog. Accessed August 15, 2023. https://researcher.life/blog/article/what-is-qualitative-research-methods-types-examples/
6. What is quantitative research? Definition, methods, types, and examples. Researcher.Life blog. Accessed August 15, 2023. https://researcher.life/blog/article/what-is-quantitative-research-types-and-examples/
7. Data analysis in research: Types & methods. QuestionPro website. Accessed August 16, 2023. https://www.questionpro.com/blog/data-analysis-in-research/#Data_analysis_in_qualitative_research
8. Factors to consider while choosing the right research methodology. PhD Monster website. Accessed August 17, 2023. https://www.phdmonster.com/factors-to-consider-while-choosing-the-right-research-methodology/
9. What is research methodology? Research and writing guides. Accessed August 14, 2023. https://paperpile.com/g/what-is-research-methodology/
10. Ethical considerations. Business research methodology website. Accessed August 17, 2023. https://research-methodology.net/research-methodology/ethical-considerations/




Here's What You Need to Understand About Research Methodology

Deeptanshu D


Research methodology involves a systematic and well-structured approach to conducting scholarly or scientific inquiries. Knowing the significance of research methodology and its different components is crucial as it serves as the basis for any study.

Typically, your research topic will start as a broad idea you want to investigate more thoroughly. Once you’ve identified a research problem and created research questions, you must choose the appropriate methodology and frameworks to address those questions effectively.

What is the definition of a research methodology?

Research methodology is the process or the way you intend to execute your study. The methodology section of a research paper outlines how you plan to conduct your study. It covers various steps such as collecting data, performing statistical analysis, observing participants, and other procedures involved in the research process.

The methods section should describe the process that will convert your idea into a study. Additionally, the outcomes of your process must provide valid and reliable results in line with the aims and objectives of your research. This rule of thumb holds true whether your paper leans qualitative or quantitative.

Studying research methods used in related studies can provide helpful insights and direction for your own research. Now easily discover papers related to your topic on SciSpace and utilize our AI research assistant, Copilot, to quickly review the methodologies applied in different papers.


The need for a good research methodology

While deciding on your approach to the research, you need to validate and explain the reasons and factors you weighed in choosing a particular problem and formulating a research topic. A research methodology helps you do exactly that. Moreover, a good research methodology lets you build an argument to validate the research work performed through various data collection methods, analytical methods, and other essential points.

Just imagine it as a strategy documented to provide an overview of what you intend to do.

While writing up or performing the research itself, you may drift toward tangents of little importance. In such cases, a research methodology helps you get back to your outlined approach.

A research methodology helps in keeping you accountable for your work. Additionally, it can help you evaluate whether your work is in sync with your original aims and objectives or not. Besides, a good research methodology enables you to navigate your research process smoothly and swiftly while providing effective planning to achieve your desired results.

What is the basic structure of a research methodology?

Usually, you must ensure to include the following stated aspects while deciding over the basic structure of your research methodology:

1. Your research procedure

Explain what research methods you’re going to use. Whether you intend to proceed with a quantitative or qualitative approach, or a composite of both, you need to state that explicitly. The choice among the three depends on your research’s aim, objectives, and scope.

2. Provide the rationality behind your chosen approach

Based on logic and reason, let your readers know why you have chosen said research methodologies. Additionally, you have to build strong arguments supporting why your chosen research method is the best way to achieve the desired outcome.

3. Explain your mechanism

The mechanism encompasses the research methods or instruments you will use to develop your research methodology. It usually refers to your data collection methods. You can use interviews, surveys, physical questionnaires, etc., from among the many available mechanisms, as research methodology instruments. The data collection method is determined by the type of research and whether the data are quantitative (numerical data) or qualitative (perceptions, morale, etc.). Moreover, you need to put logical reasoning behind choosing a particular instrument.

4. Significance of outcomes

The results will be available once you have finished experimenting. However, you should also explain how you plan to use the data to interpret the findings. This section also aids in understanding the problem from within, breaking it down into pieces, and viewing the research problem from various perspectives.

5. Reader’s advice

Anything that you feel must be explained to spread more awareness among readers and focus groups must be included and described in detail. You should not just specify your research methodology on the assumption that a reader is aware of the topic.  

All the relevant information that explains and simplifies your research paper must be included in the methodology section. If you are conducting your research in a non-traditional manner, give a logical justification and list its benefits.

6. Explain your sample space

Include information about the sample and sample space in the methodology section. The term "sample" refers to a smaller set of data that a researcher selects from a larger group of people or focus groups using a predetermined selection method. Let your readers know how you are going to distinguish between relevant and non-relevant samples. How you arrived at those exact numbers to back your research methodology, i.e., the sample size and spacing of instruments, must be discussed thoroughly.

For example, if you are going to conduct a survey or interview, explain by what procedure you will select the interviewees (or the sample size in the case of surveys) and how exactly the interview or survey will be conducted.

7. Challenges and limitations

This part, which is frequently assumed to be unnecessary, is actually very important. Whatever type of research you conduct, you must specify the challenges and limitations that your chosen strategy inherently possesses.

The importance of a good research methodology

You must have observed that all research papers, dissertations, or theses carry a chapter entirely dedicated to research methodology. This section helps maintain your credibility as a better interpreter of results rather than a manipulator.

A good research methodology always explains the procedure, data collection methods and techniques, aim, and scope of the research. In a research study, it leads to a well-organized, rationality-based approach, while the paper lacking it is often observed as messy or disorganized.

You should pay special attention to validating your chosen way towards the research methodology. This becomes extremely important in case you select an unconventional or a distinct method of execution.

Curating and developing a strong, effective research methodology can assist you in addressing a variety of situations, such as:

  • When someone tries to duplicate or expand upon your research after a few years.
  • If a contradiction or conflict of facts occurs at a later time. This gives you the security you need to deal with these contradictions while still being able to defend your approach.
  • Gaining a tactical approach in getting your research completed in time. Just ensure you are using the right approach while drafting your research methodology, and it can help you achieve your desired outcomes. Additionally, it provides a better explanation and understanding of the research question itself.
  • Documenting the results so that the final outcome of the research stays as you intended it to be while starting.

Instruments you could use while writing a good research methodology

As a researcher, you must choose the tools or data collection methods that fit your research best. This decision has to be a wise one.

Many research tools and equipment exist that you can use to carry out your research process. These are classified as:

a. Interviews (One-on-One or a Group)

An interview aimed to get your desired research outcomes can be undertaken in many different ways. For example, you can design your interview as structured, semi-structured, or unstructured. What sets them apart is the degree of formality in the questions. On the other hand, in a group interview, your aim should be to collect more opinions and group perceptions from the focus groups on a certain topic rather than looking out for some formal answers.

b. Surveys

In surveys, you are in better control if you specifically draft the questions you seek responses to. For example, you may choose to include free-style questions that can be answered descriptively, or you may provide multiple-choice responses to questions. You can also opt for both, deciding what suits your research process and purpose better.

c. Sample Groups

Similar to the group interviews, here, you can select a group of individuals and assign them a topic to discuss or freely express their opinions over that. You can simultaneously note down the answers and later draft them appropriately, deciding on the relevance of every response.

d. Observations

If your research domain is the humanities or sociology, observations are a well-proven method for grounding your research methodology. You can study the spontaneous responses of participants to a situation, or conduct the study in a more structured manner: a structured observation means putting the participants in a situation at a previously decided time and then studying their responses.

Of all the tools described above, it is you who should wisely choose the instruments and decide what fits your research best. Do not restrict yourself to a single method; a combination of a few instruments may be appropriate for drafting a good research methodology.

Types of research methodology

A research methodology exists in various forms. Depending upon their approach, whether centered around words, numbers, or both, methodologies are distinguished as qualitative, quantitative, or an amalgamation of both.

1. Qualitative research methodology

When a research methodology primarily focuses on words and textual data, then it is generally referred to as qualitative research methodology. This type is usually preferred among researchers when the aim and scope of the research are mainly theoretical and explanatory.

The instruments used are observations, interviews, and sample groups. You can use this methodology if you are trying to study human behavior or response in some situations. Generally, qualitative research methodology is widely used in sociology, psychology, and other related domains.

2. Quantitative research methodology

If your research is majorly centered on data, figures, and stats, then analyzing these numerical data is often referred to as quantitative research methodology. You can use quantitative research methodology if your research requires you to validate or justify the obtained results.

In quantitative methods, surveys, tests, experiments, and evaluations of current databases can be used to advantage as instruments. If your research involves testing a hypothesis, use this methodology.

3. Amalgam methodology

As the name suggests, the amalgam methodology uses both quantitative and qualitative approaches. This methodology is used when a part of the research requires you to verify the facts and figures, whereas the other part demands you to discover the theoretical and explanatory nature of the research question.

The instruments for the mixed-method methodology include interviews and surveys as well as tests and experiments. The outcome of this methodology can be insightful and valuable, as it provides precise test results alongside theoretical explanations and reasoning.

The mixed-method approach makes your work both factual and rational at the same time.

Final words: How to decide which is the best research methodology?

If you have stayed attentive to the aims and scope of your research, you should already have an idea of which research methodology suits your work best.

Before deciding which research methodology answers your research question, invest significant time in background reading. Consulting references that have yielded relevant results should be your first step in establishing a research methodology.

Moreover, never refrain from exploring other options. Before setting your work in stone, weigh all the available options so you can explain why the research methodology you finally choose is more appropriate than the alternatives.

You should always choose a quantitative research methodology if your research requires gathering large amounts of data, figures, and statistics. This methodology will deliver results if your research paper involves the validation of a hypothesis.

If, on the other hand, you are looking for explanations, reasons, opinions, and public perceptions around a theory, you must use a qualitative research methodology. The choice of an appropriate research methodology ultimately depends on what you want to achieve through your research.

Frequently Asked Questions (FAQs) about Research Methodology

1. How to write a research methodology?

Provide a separate research methodology section that specifies the methods and instruments used during the research, discusses how the results were analyzed, gives relevant background information, and conveys the research limitations.

2. What are the types of research methodology?

There are generally three types of research methodology:

  • Qualitative
  • Quantitative
  • Mixed-method

3. What is the true meaning of research methodology?

The set of techniques or procedures followed to discover and analyze information gathered to validate or justify a research outcome is generally called research methodology.

4. Why is research methodology important?

Your research methodology directly reflects the validity of your research outcomes and how well-informed your research work is. Moreover, it can help future researchers cite or refer to your research if they plan to use a similar research methodology.



Organizing Your Social Sciences Research Paper

6. The Methodology

The methods section describes the actions taken to investigate a research problem and the rationale for the specific procedures or techniques used to identify, select, process, and analyze information applied to understanding the problem, thereby allowing the reader to critically evaluate a study’s overall validity and reliability. The methodology section of a research paper answers two main questions: How was the data collected or generated? And how was it analyzed? The writing should be direct and precise and always written in the past tense.

Kallet, Richard H. "How to Write the Methods Section of a Research Paper." Respiratory Care 49 (October 2004): 1229-1232.

Importance of a Good Methodology Section

You must explain how you obtained and analyzed your results for the following reasons:

  • Readers need to know how the data was obtained because the method you chose affects the results and, by extension, how you interpreted their significance in the discussion section of your paper.
  • Methodology is crucial for any branch of scholarship because an unreliable method produces unreliable results and, as a consequence, undermines the value of your analysis of the findings.
  • In most cases, there are a variety of different methods you can choose to investigate a research problem. The methodology section of your paper should clearly articulate the reasons why you have chosen a particular procedure or technique.
  • The reader wants to know that the data was collected or generated in a way that is consistent with accepted practice in the field of study. For example, if you are using a multiple choice questionnaire, readers need to know that it offered your respondents a reasonable range of answers to choose from.
  • The method must be appropriate to fulfilling the overall aims of the study. For example, you need to ensure that you have a large enough sample size to be able to generalize and make recommendations based upon the findings.
  • The methodology should discuss the problems that were anticipated and the steps you took to prevent them from occurring. For any problems that do arise, you must describe the ways in which they were minimized or why these problems do not impact in any meaningful way your interpretation of the findings.
  • In the social and behavioral sciences, it is important to always provide sufficient information to allow other researchers to adopt or replicate your methodology. This information is particularly important when a new method has been developed or an innovative use of an existing method is utilized.

Bem, Daryl J. Writing the Empirical Journal Article. Psychology Writing Center. University of Washington; Denscombe, Martyn. The Good Research Guide: For Small-Scale Social Research Projects . 5th edition. Buckingham, UK: Open University Press, 2014; Lunenburg, Frederick C. Writing a Successful Thesis or Dissertation: Tips and Strategies for Students in the Social and Behavioral Sciences . Thousand Oaks, CA: Corwin Press, 2008.

Structure and Writing Style

I.  Groups of Research Methods

There are two main groups of research methods in the social sciences:

  • The empirical-analytical group approaches the study of the social sciences in a manner similar to how researchers study the natural sciences. This type of research focuses on objective knowledge, research questions that can be answered yes or no, and operational definitions of variables to be measured. The empirical-analytical group employs deductive reasoning that uses existing theory as a foundation for formulating hypotheses that need to be tested. This approach is focused on explanation.
  • The interpretative group of methods is focused on understanding phenomena in a comprehensive, holistic way. Interpretive methods focus on analytically disclosing the meaning-making practices of human subjects [the why, how, or by what means people do what they do], while showing how those practices arrange themselves so that they can be used to generate observable outcomes. Interpretive methods allow you to recognize your connection to the phenomena under investigation. However, the interpretative group requires careful examination of variables because it focuses more on subjective knowledge.

II.  Content

The introduction to your methodology section should begin by restating the research problem and underlying assumptions underpinning your study. This is followed by situating the methods you used to gather, analyze, and process information within the overall “tradition” of your field of study and within the particular research design you have chosen to study the problem. If the method you choose lies outside of the tradition of your field [i.e., your review of the literature demonstrates that the method is not commonly used], provide a justification for how your choice of methods specifically addresses the research problem in ways that have not been utilized in prior studies.

The remainder of your methodology section should describe the following:

  • Decisions made in selecting the data you have analyzed or, in the case of qualitative research, the subjects and research setting you have examined,
  • Tools and methods used to identify and collect information, and how you identified relevant variables,
  • The ways in which you processed the data and the procedures you used to analyze that data, and
  • The specific research tools or strategies that you utilized to study the underlying hypothesis and research questions.

In addition, an effectively written methodology section should:

  • Introduce the overall methodological approach for investigating your research problem. Is your study qualitative or quantitative, or a combination of both (mixed method)? Are you going to take a special approach, such as action research, or a more neutral stance?
  • Indicate how the approach fits the overall research design. Your methods for gathering data should have a clear connection to your research problem. In other words, make sure that your methods will actually address the problem. One of the most common deficiencies found in research papers is that the proposed methodology is not suited to achieving the stated objective of the paper.
  • Describe the specific methods of data collection you are going to use, such as surveys, interviews, questionnaires, observation, or archival research. If you are analyzing existing data, such as a data set or archival documents, describe how it was originally created or gathered and by whom. Also be sure to explain how older data is still relevant to investigating the current research problem.
  • Explain how you intend to analyze your results. Will you use statistical analysis? Will you use specific theoretical perspectives to help you analyze a text or explain observed behaviors? Describe how you plan to obtain an accurate assessment of relationships, patterns, trends, distributions, and possible contradictions found in the data.
  • Provide background and a rationale for methodologies that are unfamiliar to your readers. Very often in the social sciences, research problems and the methods for investigating them require more explanation and rationale than the widely accepted rules governing the natural and physical sciences. Be clear and concise in your explanation.
  • Provide a justification for subject selection and sampling procedure. For instance, if you propose to conduct interviews, how do you intend to select the sample population? If you are analyzing texts, which texts have you chosen, and why? If you are using statistics, why is this set of data being used? If other data sources exist, explain why the data you chose is most appropriate to addressing the research problem.
  • Provide a justification for case study selection. A common method of analyzing research problems in the social sciences is to analyze specific cases. These can be a person, place, event, phenomenon, or other type of subject of analysis that is either examined as a singular topic of in-depth investigation or as multiple topics of investigation studied for the purpose of comparing or contrasting findings. In either method, you should explain why a case or cases were chosen and how they specifically relate to the research problem.
  • Describe potential limitations. Are there any practical limitations that could affect your data collection? How will you attempt to control for potential confounding variables and errors? If your methodology may lead to problems you can anticipate, state this openly and show why pursuing this methodology outweighs the risk of these problems cropping up.

NOTE: Once you have written all of the elements of the methods section, subsequent revisions should focus on how to present those elements as clearly and as logically as possible. The description of how you prepared to study the research problem, how you gathered the data, and the protocol for analyzing the data should be organized chronologically. For clarity, when a large amount of detail must be presented, information should be presented in sub-sections according to topic. If necessary, consider using appendices for raw data.

ANOTHER NOTE: If you are conducting a qualitative analysis of a research problem, the methodology section generally requires a more elaborate description of the methods used, as well as an explanation of the processes applied to gathering and analyzing data, than is generally required for studies using quantitative methods. Because you are the primary instrument for generating the data [e.g., through interviews or observations], the process for collecting that data has a significantly greater impact on producing the findings. Therefore, qualitative research requires a more detailed description of the methods used.

YET ANOTHER NOTE: If your study involves interviews, observations, or other qualitative techniques involving human subjects, you may be required to obtain approval from the university's Office for the Protection of Research Subjects before beginning your research. This is not a common procedure for most undergraduate-level student research assignments. However, if your professor states you need approval, you must include a statement in your methods section that you received official endorsement and adequate informed consent from the office and that there was a clear assessment and minimization of risks to participants and to the university. This statement informs the reader that your study was conducted in an ethical and responsible manner. In some cases, the approval notice is included as an appendix to your paper.

III.  Problems to Avoid

Irrelevant Detail

The methodology section of your paper should be thorough but concise. Do not provide any background information that does not directly help the reader understand why a particular method was chosen, how the data was gathered or obtained, and how the data was analyzed in relation to the research problem [note: analyzed, not interpreted! Save how you interpreted the findings for the discussion section]. With this in mind, the page length of your methods section will generally be less than any other section of your paper except the conclusion.

Unnecessary Explanation of Basic Procedures

Remember that you are not writing a how-to guide about a particular method. You should assume that readers possess a basic understanding of how to investigate the research problem on their own and, therefore, you do not have to go into great detail about specific methodological procedures. The focus should be on how you applied a method, not on the mechanics of doing a method. An exception to this rule is if you select an unconventional methodological approach; if this is the case, be sure to explain why this approach was chosen and how it enhances the overall process of discovery.

Problem Blindness

It is almost a given that you will encounter problems when collecting or generating your data, or that gaps will exist in existing data or archival materials. Do not ignore these problems or pretend they did not occur. Often, documenting how you overcame obstacles can form an interesting part of the methodology. It demonstrates to the reader that you can provide a cogent rationale for the decisions you made to minimize the impact of any problems that arose.

Literature Review

Just as the literature review section of your paper provides an overview of sources you have examined while researching a particular topic, the methodology section should cite any sources that informed your choice and application of a particular method [i.e., the choice of a survey should include any citations to the works you used to help construct the survey].

It’s More than Sources of Information!

A description of a research study's method should not be confused with a description of the sources of information. Such a list of sources is useful in and of itself, especially if it is accompanied by an explanation about the selection and use of the sources. The description of the project's methodology complements a list of sources in that it sets forth the organization and interpretation of information emanating from those sources.

Azevedo, L.F. et al. "How to Write a Scientific Paper: Writing the Methods Section." Revista Portuguesa de Pneumologia 17 (2011): 232-238; Blair Lorrie. “Choosing a Methodology.” In Writing a Graduate Thesis or Dissertation , Teaching Writing Series. (Rotterdam: Sense Publishers 2016), pp. 49-72; Butin, Dan W. The Education Dissertation A Guide for Practitioner Scholars . Thousand Oaks, CA: Corwin, 2010; Carter, Susan. Structuring Your Research Thesis . New York: Palgrave Macmillan, 2012; Kallet, Richard H. “How to Write the Methods Section of a Research Paper.” Respiratory Care 49 (October 2004):1229-1232; Lunenburg, Frederick C. Writing a Successful Thesis or Dissertation: Tips and Strategies for Students in the Social and Behavioral Sciences . Thousand Oaks, CA: Corwin Press, 2008. Methods Section. The Writer’s Handbook. Writing Center. University of Wisconsin, Madison; Rudestam, Kjell Erik and Rae R. Newton. “The Method Chapter: Describing Your Research Plan.” In Surviving Your Dissertation: A Comprehensive Guide to Content and Process . (Thousand Oaks, Sage Publications, 2015), pp. 87-115; What is Interpretive Research. Institute of Public and International Affairs, University of Utah; Writing the Experimental Report: Methods, Results, and Discussion. The Writing Lab and The OWL. Purdue University; Methods and Materials. The Structure, Format, Content, and Style of a Journal-Style Scientific Paper. Department of Biology. Bates College.

Writing Tip

Statistical Designs and Tests? Do Not Fear Them!

Don't avoid using a quantitative approach to analyzing your research problem just because you fear the idea of applying statistical designs and tests. A qualitative approach, such as conducting interviews or content analysis of archival texts, can yield exciting new insights about a research problem, but it should not be undertaken simply because you have a disdain for running a simple regression. A well-designed quantitative research study can often be accomplished in very clear and direct ways, whereas a similar study of a qualitative nature usually requires considerable time to analyze large volumes of data and a tremendous effort to create new paths for analysis where previously no path associated with your research problem existed.


Another Writing Tip

Knowing the Relationship Between Theories and Methods

There can be multiple meanings associated with the term "theories" and the term "methods" in social sciences research. A helpful way to delineate them is to understand "theories" as representing different ways of characterizing the social world when you research it and "methods" as representing different ways of generating and analyzing data about that social world. Framed in this way, all empirical social sciences research involves theories and methods, whether they are stated explicitly or not. However, while theories and methods are often related, it is important that, as a researcher, you deliberately separate them in order to avoid your theories playing a disproportionate role in shaping what outcomes your chosen methods produce.

Engage introspectively in an ongoing dialectic between the application of theories and methods so that you can use the outcomes of your methods to interrogate and develop new theories, or new ways of conceptually framing the research problem. This is how scholarship grows and branches out into new intellectual territory.

Reynolds, R. Larry. Ways of Knowing. Alternative Microeconomics . Part 1, Chapter 3. Boise State University; The Theory-Method Relationship. S-Cool Revision. United Kingdom.

Yet Another Writing Tip

Methods and the Methodology

Do not confuse the terms "methods" and "methodology." As Schneider notes, a method refers to the technical steps taken to do research. Descriptions of methods usually include defining and stating why you have chosen specific techniques to investigate a research problem, followed by an outline of the procedures you used to systematically select, gather, and process the data [remember to always save the interpretation of data for the discussion section of your paper].

The methodology refers to a discussion of the underlying reasoning why particular methods were used. This discussion includes describing the theoretical concepts that inform the choice of methods to be applied, placing the choice of methods within the more general nature of academic work, and reviewing its relevance to examining the research problem. The methodology section also includes a thorough review of the methods other scholars have used to study the topic.

Bryman, Alan. "Of Methods and Methodology." Qualitative Research in Organizations and Management: An International Journal 3 (2008): 159-168; Schneider, Florian. “What's in a Methodology: The Difference between Method, Methodology, and Theory…and How to Get the Balance Right?” PoliticsEastAsia.com. Chinese Department, University of Leiden, Netherlands.

  • Last Updated: Apr 5, 2024 1:38 PM
  • URL: https://libguides.usc.edu/writingguide

How to Write Research Methodology

Last Updated: May 21, 2023 Approved

This article was co-authored by Alexander Ruiz, M.Ed. and by wikiHow staff writer, Jennifer Mueller, JD. Alexander Ruiz is an Educational Consultant and the Educational Director of Link Educational Institute, a tutoring business based in Claremont, California that provides customizable educational plans, subject and test prep tutoring, and college application consulting. With over a decade and a half of experience in the education industry, Alexander coaches students to increase their self-awareness and emotional intelligence while achieving skills and the goal of higher education. He holds a BA in Psychology from Florida International University and an MA in Education from Georgia Southern University. wikiHow marks an article as reader-approved once it receives enough positive feedback. In this case, several readers have written to tell us that this article was helpful to them, earning it our reader-approved status. This article has been viewed 516,480 times.

The research methodology section of any academic research paper gives you the opportunity to convince your readers that your research is useful and will contribute to your field of study. An effective research methodology is grounded in your overall approach – whether qualitative or quantitative – and adequately describes the methods you used. Justify why you chose those methods over others, then explain how those methods will provide answers to your research questions. [1]

Describing Your Methods

Step 1: Restate your research problem.

  • In your restatement, include any underlying assumptions that you're making or conditions that you're taking for granted. These assumptions will also inform the research methods you've chosen.
  • Generally, state the variables you'll test and the other conditions you're controlling or assuming are equal.

Step 2: Establish your overall methodological approach.

  • If you want to research and document measurable social trends, or evaluate the impact of a particular policy on various variables, use a quantitative approach focused on data collection and statistical analysis.
  • If you want to evaluate people's views or understanding of a particular issue, choose a more qualitative approach.
  • You can also combine the two. For example, you might look primarily at a measurable social trend, but also interview people and get their opinions on how that trend is affecting their lives.

Step 3: Define how you collected or generated data.

  • For example, if you conducted a survey, you would describe the questions included in the survey, where and how the survey was conducted (such as in person, online, over the phone), how many surveys were distributed, and how long your respondents had to complete the survey.
  • Include enough detail that your study can be replicated by others in your field, even if they may not get the same results you did. [4]

Step 4: Provide background for uncommon methods.

  • Qualitative research methods typically require more detailed explanation than quantitative methods.
  • Basic investigative procedures don't need to be explained in detail. Generally, you can assume that your readers have a general understanding of common research methods that social scientists use, such as surveys or focus groups.

Step 5: Cite any sources that contributed to your choice of methodology.

  • For example, suppose you conducted a survey and used a couple of other research papers to help construct the questions on your survey. You would mention those as contributing sources.

Justifying Your Choice of Methods

Step 1: Explain your selection criteria for data collection.

  • Describe study participants specifically, and list any inclusion or exclusion criteria you used when forming your group of participants.
  • Justify the size of your sample, if applicable, and describe how this affects whether your study can be generalized to larger populations. For example, if you conducted a survey of 30 percent of the student population of a university, you could potentially apply those results to the student body as a whole, but maybe not to students at other universities.
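The sample-size point above can be made concrete with a quick margin-of-error calculation. The population and sample figures below are hypothetical, echoing the 30-percent example, and only the Python standard library is used:

```python
# Hypothetical numbers: margin of error for a survey sample at 95% confidence,
# with a finite-population correction for a university of 10,000 students.
import math

N = 10_000  # population size (hypothetical)
n = 3_000   # sample size (30% of the population)
p = 0.5     # most conservative assumed proportion
z = 1.96    # z-score for 95% confidence

se = math.sqrt(p * (1 - p) / n)
fpc = math.sqrt((N - n) / (N - 1))  # finite-population correction
margin = z * se * fpc
print(f"margin of error = +/- {margin * 100:.2f} percentage points")
```

A small margin of error like this is one way to justify generalizing results from the sample to the student body as a whole; the same sample would not support claims about students at other universities.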

Step 2: Distinguish your research from any weaknesses in your methods.

  • Reading other research papers is a good way to identify potential problems that commonly arise with various methods. State whether you actually encountered any of these common problems during your research.

Step 3: Describe how you overcame obstacles.

  • If you encountered any problems as you collected data, explain clearly the steps you took to minimize the effect that problem would have on your results.

Step 4: Evaluate other methods you could have used.

  • In some cases, this may be as simple as stating that while there were numerous studies using one method, there weren't any using your method, which caused a gap in understanding of the issue.
  • For example, there may be multiple papers providing quantitative analysis of a particular social trend. However, none of these papers looked closely at how this trend was affecting the lives of people.

Connecting Your Methods to Your Research Goals

Step 1: Describe how you analyzed your results.

  • Depending on your research questions, you may be mixing quantitative and qualitative analysis – just as you could potentially use both approaches. For example, you might do a statistical analysis, and then interpret those statistics through a particular theoretical lens.

Step 2: Explain how your analysis suits your research goals.

  • For example, suppose you're researching the effect of college education on family farms in rural America. While you could do interviews of college-educated people who grew up on a family farm, that would not give you a picture of the overall effect. A quantitative approach and statistical analysis would give you a bigger picture.

Step 3: Identify how your analysis answers your research questions.

  • If in answering your research questions, your findings have raised other questions that may require further research, state these briefly.
  • You can also include here any limitations to your methods, or questions that weren't answered through your research.

Step 4: Assess whether your findings can be transferred or generalized.

  • Generalization is more typically used in quantitative research. If you have a well-designed sample, you can statistically apply your results to the larger population your sample belongs to.

Template to Write Research Methodology


  • Organize your methodology section chronologically, starting with how you prepared to conduct your research methods, how you gathered data, and how you analyzed that data. [13]
  • Write your research methodology section in past tense, unless you're submitting the methodology section before the research described has been carried out. [14]
  • Discuss your plans in detail with your advisor or supervisor before committing to a particular methodology. They can help identify possible flaws in your study. [15]


References

  • http://expertjournals.com/how-to-write-a-research-methodology-for-your-academic-article/
  • http://libguides.usc.edu/writingguide/methodology
  • https://www.skillsyouneed.com/learn/dissertation-methodology.html
  • https://uir.unisa.ac.za/bitstream/handle/10500/4245/05Chap%204_Research%20methodology%20and%20design.pdf
  • https://elc.polyu.edu.hk/FYP/html/method.htm

About This Article

Alexander Ruiz, M.Ed.

To write a research methodology, start with a section that outlines the problems or questions you'll be studying, including your hypotheses or whatever it is you're setting out to prove. Then, briefly explain why you chose to use either a qualitative or quantitative approach for your study. Next, go over when and where you conducted your research and what parameters you used to ensure you were objective. Finally, cite any sources you used to decide on the methodology for your research.

How to Write a Dissertation Proposal | A Step-by-Step Guide

Published on 14 February 2020 by Jack Caulfield . Revised on 11 November 2022.

A dissertation proposal describes the research you want to do: what it’s about, how you’ll conduct it, and why it’s worthwhile. You will probably have to write a proposal before starting your dissertation as an undergraduate or postgraduate student.

A dissertation proposal should generally include:

  • An introduction to your topic and aims
  • A literature review  of the current state of knowledge
  • An outline of your proposed methodology
  • A discussion of the possible implications of the research
  • A bibliography  of relevant sources

Dissertation proposals vary a lot in terms of length and structure, so make sure to follow any guidelines given to you by your institution, and check with your supervisor when you’re unsure.

Table of contents

  • Step 1: Coming up with an idea
  • Step 2: Presenting your idea in the introduction
  • Step 3: Exploring related research in the literature review
  • Step 4: Describing your methodology
  • Step 5: Outlining the potential implications of your research
  • Step 6: Creating a reference list or bibliography

Before writing your proposal, it’s important to come up with a strong idea for your dissertation.

Find an area of your field that interests you and do some preliminary reading in that area. What are the key concerns of other researchers? What do they suggest as areas for further research, and what strikes you personally as an interesting gap in the field?

Once you have an idea, consider how to narrow it down and the best way to frame it. Don’t be too ambitious or too vague – a dissertation topic needs to be specific enough to be feasible. Move from a broad field of interest to a specific niche:

  • Russian literature → 19th century Russian literature → The novels of Tolstoy and Dostoevsky
  • Social media → Mental health effects of social media → Influence of social media on young adults suffering from anxiety


Like most academic texts, a dissertation proposal begins with an introduction . This is where you introduce the topic of your research, provide some background, and most importantly, present your aim , objectives and research question(s) .

Try to dive straight into your chosen topic: What’s at stake in your research? Why is it interesting? Don’t spend too long on generalisations or grand statements:

  • Too general: Social media is the most important technological trend of the 21st century. It has changed the world and influences our lives every day.
  • Better: Psychologists generally agree that the ubiquity of social media in the lives of young adults today has a profound impact on their mental health. However, the exact nature of this impact needs further investigation.

Once your area of research is clear, you can present more background and context. What does the reader need to know to understand your proposed questions? What’s the current state of research on this topic, and what will your dissertation contribute to the field?

If you’re including a literature review, you don’t need to go into too much detail at this point, but give the reader a general sense of the debates that you’re intervening in.

This leads you into the most important part of the introduction: your aim, objectives and research question(s) . These should be clearly identifiable and stand out from the text – for example, you could present them using bullet points or bold font.

Make sure that your research questions are specific and workable – something you can reasonably answer within the scope of your dissertation. Avoid being too broad or having too many different questions. Remember that your goal in a dissertation proposal is to convince the reader that your research is valuable and feasible:

  • Too broad: Does social media harm mental health?
  • Specific and workable: What is the impact of daily social media use on 18- to 25-year-olds suffering from generalised anxiety disorder?

Now that your topic is clear, it’s time to explore existing research covering similar ideas. This is important because it shows you what is missing from other research in the field and ensures that you’re not asking a question someone else has already answered.

You’ve probably already done some preliminary reading, but now that your topic is more clearly defined, you need to thoroughly analyse and evaluate the most relevant sources in your literature review .

Here you should summarise the findings of other researchers and comment on gaps and problems in their studies. There may be a lot of research to cover, so make effective use of paraphrasing to write concisely:

  • Quotation: Smith and Prakash state that ‘our results indicate a 25% decrease in the incidence of mechanical failure after the new formula was applied’.
  • Paraphrase: Smith and Prakash’s formula reduced mechanical failures by 25%.

The point is to identify findings and theories that will influence your own research, but also to highlight gaps and limitations in previous research which your dissertation can address:

  • Subsequent research has failed to replicate this result, however, suggesting a flaw in Smith and Prakash’s methods. It is likely that the failure resulted from…

Next, you’ll describe your proposed methodology : the specific things you hope to do, the structure of your research and the methods that you will use to gather and analyse data.

You should get quite specific in this section – you need to convince your supervisor that you’ve thought through your approach to the research and can realistically carry it out. This section will look quite different, and vary in length, depending on your field of study.

You may be engaged in more empirical research, focusing on data collection and discovering new information, or more theoretical research, attempting to develop a new conceptual model or add nuance to an existing one.

Dissertation research often involves both, but the content of your methodology section will vary according to how important each approach is to your dissertation.

Empirical research

Empirical research involves collecting new data and analysing it in order to answer your research questions. It can be quantitative (focused on numbers), qualitative (focused on words and meanings), or a combination of both.

With empirical research, it’s important to describe in detail how you plan to collect your data:

  • Will you use surveys ? A lab experiment ? Interviews?
  • What variables will you measure?
  • How will you select a representative sample ?
  • If other people will participate in your research, what measures will you take to ensure they are treated ethically?
  • What tools (conceptual and physical) will you use, and why?

It’s appropriate to cite other research here. When you need to justify your choice of a particular research method or tool, for example, you can cite a text describing the advantages and appropriate usage of that method.

Don’t overdo this, though; you don’t need to reiterate the whole theoretical literature, just what’s relevant to the choices you have made.

Moreover, your research will necessarily involve analysing the data after you have collected it. Though you don’t know yet what the data will look like, it’s important to know what you’re looking for and indicate what methods (e.g. statistical tests , thematic analysis ) you will use.
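As a concrete illustration of naming an analysis method in advance, here is a minimal Python sketch of one simple statistical test, a two-sided permutation test for a difference in group means. The group names and survey scores are invented for illustration, not taken from this guide:

```python
import random
from statistics import mean

def permutation_test(group_a, group_b, n_resamples=2000, seed=42):
    """Two-sided permutation test for a difference in group means."""
    observed = abs(mean(group_a) - mean(group_b))
    combined = group_a + group_b
    rng = random.Random(seed)  # fixed seed so the result is reproducible
    extreme = 0
    for _ in range(n_resamples):
        rng.shuffle(combined)
        # Re-split the shuffled data into two groups of the original sizes
        diff = abs(mean(combined[:len(group_a)]) - mean(combined[len(group_a):]))
        if diff >= observed:
            extreme += 1
    # +1 in numerator and denominator so the p-value is never exactly zero
    return (extreme + 1) / (n_resamples + 1)

# Hypothetical survey scores from two groups of respondents
control = [3.1, 2.9, 3.4, 3.0, 2.8]
treatment = [4.2, 4.5, 3.9, 4.4, 4.1]
print(f"p = {permutation_test(control, treatment):.3f}")
```

A small p-value suggests the observed group difference is unlikely under random relabelling. This is only one of many possible tests; the point is that the method can be named and justified before any real data exists.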

Theoretical research

You can also do theoretical research that doesn’t involve original data collection. In this case, your methodology section will focus more on the theory you plan to work with in your dissertation: relevant conceptual models and the approach you intend to take.

For example, a literary analysis dissertation rarely involves collecting new data, but it’s still necessary to explain the theoretical approach that will be taken to the text(s) under discussion, as well as which parts of the text(s) you will focus on:

  • This dissertation will utilise Foucault’s theory of panopticism to explore the theme of surveillance in Orwell’s 1984 and Kafka’s The Trial…

Here, you may refer to the same theorists you have already discussed in the literature review. In this case, the emphasis is placed on how you plan to use their contributions in your own research.

You’ll usually conclude your dissertation proposal with a section discussing what you expect your research to achieve.

You obviously can’t be too sure: you don’t know yet what your results and conclusions will be. Instead, you should describe the projected implications and contribution to knowledge of your dissertation.

First, consider the potential implications of your research. Will you:

  • Develop or test a theory?
  • Provide new information to governments or businesses?
  • Challenge a commonly held belief?
  • Suggest an improvement to a specific process?

Describe the intended result of your research and the theoretical or practical impact it will have:

Finally, it’s sensible to conclude by briefly restating the contribution to knowledge you hope to make: the specific question(s) you hope to answer and the gap the answer(s) will fill in existing knowledge:

As with any academic text, it’s important that your dissertation proposal effectively references all the sources you have used. You need to include a properly formatted reference list or bibliography at the end of your proposal.

Different institutions recommend different styles of referencing – commonly used styles include Harvard , Vancouver , APA , or MHRA . If your department does not have specific requirements, choose a style and apply it consistently.

A reference list includes only the sources that you cited in your proposal. A bibliography is slightly different: it can include every source you consulted in preparing the proposal, even if you didn’t mention it in the text. In the case of a dissertation proposal, a bibliography may also list relevant sources that you haven’t yet read, but that you intend to use during the research itself.

Check with your supervisor what type of bibliography or reference list you should include.

Cite this Scribbr article


Caulfield, J. (2022, November 11). How to Write a Dissertation Proposal | A Step-by-Step Guide. Scribbr. Retrieved 2 April 2024, from https://www.scribbr.co.uk/thesis-dissertation/proposal/


Grad Coach

How To Choose Your Research Methodology

Qualitative vs Quantitative vs Mixed Methods

By: Derek Jansen (MBA). Expert Reviewed By: Dr Eunice Rautenbach | June 2021

Without a doubt, one of the most common questions we receive at Grad Coach is “ How do I choose the right methodology for my research? ”. It’s easy to see why – with so many options on the research design table, it’s easy to get intimidated, especially with all the complex lingo!

In this post, we’ll explain the three overarching types of research – qualitative, quantitative and mixed methods – and how you can go about choosing the best methodological approach for your research.

Overview: Choosing Your Methodology

  • Understanding the options: qualitative research, quantitative research, and mixed methods-based research
  • Choosing a research methodology: the nature of the research, research area norms, and practicalities


1. Understanding the options

Before we jump into the question of how to choose a research methodology, it’s useful to take a step back to understand the three overarching types of research – qualitative , quantitative and mixed methods -based research. Each of these options takes a different methodological approach.

Qualitative research utilises data that is not numbers-based. In other words, qualitative research focuses on words , descriptions , concepts or ideas – while quantitative research makes use of numbers and statistics. Qualitative research investigates the “softer side” of things to explore and describe, while quantitative research focuses on the “hard numbers”, to measure differences between variables and the relationships between them.

Importantly, qualitative research methods are typically used to explore and gain a deeper understanding of the complexity of a situation – to draw a rich picture . In contrast to this, quantitative methods are usually used to confirm or test hypotheses . In other words, they have distinctly different purposes. The table below highlights a few of the key differences between qualitative and quantitative research – you can learn more about the differences here.

Qualitative research:

  • Uses an inductive approach
  • Is used to build theories
  • Takes a subjective approach
  • Adopts an open and flexible approach
  • The researcher is close to the respondents
  • Interviews and focus groups are often used to collect word-based data
  • Generally draws on small sample sizes
  • Uses qualitative data analysis techniques (e.g. content analysis, thematic analysis)

Quantitative research:

  • Uses a deductive approach
  • Is used to test theories
  • Takes an objective approach
  • Adopts a closed, highly planned approach
  • The researcher is disconnected from the respondents
  • Surveys or laboratory equipment are often used to collect number-based data
  • Generally requires large sample sizes
  • Uses statistical analysis techniques to make sense of the data

Mixed methods -based research, as you’d expect, attempts to bring these two types of research together, drawing on both qualitative and quantitative data. Quite often, mixed methods-based studies will use qualitative research to explore a situation and develop a potential model of understanding (this is called a conceptual framework), and then go on to use quantitative methods to test that model empirically.

In other words, while qualitative and quantitative methods (and the philosophies that underpin them) are completely different, they are not at odds with each other. It’s not a competition of qualitative vs quantitative. On the contrary, they can be used together to develop a high-quality piece of research. Of course, this is easier said than done, so we usually recommend that first-time researchers stick to a single approach , unless the nature of their study truly warrants a mixed-methods approach.

The key takeaway here, and the reason we started by looking at the three options, is that it’s important to understand that each methodological approach has a different purpose – for example, to explore and understand situations (qualitative), to test and measure (quantitative) or to do both. They’re not simply alternative tools for the same job. 

Right – now that we’ve got that out of the way, let’s look at how you can go about choosing the right methodology for your research.

Methodology choices in research

2. How to choose a research methodology

To choose the right research methodology for your dissertation or thesis, you need to consider three important factors . Based on these three factors, you can decide on your overarching approach – qualitative, quantitative or mixed methods. Once you’ve made that decision, you can flesh out the finer details of your methodology, such as the sampling , data collection methods and analysis techniques (we discuss these separately in other posts ).

The three factors you need to consider are:

  • The nature of your research aims, objectives and research questions
  • The methodological approaches taken in the existing literature
  • Practicalities and constraints

Let’s take a look at each of these.

Factor #1: The nature of your research

As I mentioned earlier, each type of research (and therefore, research methodology), whether qualitative, quantitative or mixed, has a different purpose and helps solve a different type of question. So, it’s logical that the key deciding factor in terms of which research methodology you adopt is the nature of your research aims, objectives and research questions .

But, what types of research exist?

Broadly speaking, research can fall into one of three categories:

  • Exploratory – getting a better understanding of an issue and potentially developing a theory regarding it
  • Confirmatory – confirming a potential theory or hypothesis by testing it empirically
  • A mix of both – building a potential theory or hypothesis and then testing it

As a rule of thumb, exploratory research tends to adopt a qualitative approach , whereas confirmatory research tends to use quantitative methods . This isn’t set in stone, but it’s a very useful heuristic. Naturally then, research that combines a mix of both, or is seeking to develop a theory from the ground up and then test that theory, would utilize a mixed-methods approach.

Exploratory vs confirmatory research

Let’s look at an example in action.

If your research aims were to understand the perspectives of war veterans regarding certain political matters, you’d likely adopt a qualitative methodology, making use of interviews to collect data and one or more qualitative data analysis methods to make sense of the data.

If, on the other hand, your research aims involved testing a set of hypotheses regarding the link between political leaning and income levels, you’d likely adopt a quantitative methodology, using numbers-based data from a survey to measure the links between variables and/or constructs .

So, the first (and most important) thing you need to consider when deciding which methodological approach to use for your research project is the nature of your research aims , objectives and research questions. Specifically, you need to assess whether your research leans in an exploratory or confirmatory direction or involves a mix of both.

The importance of achieving solid alignment between these three factors and your methodology can’t be overstated. If they’re misaligned, you’re going to be forcing a square peg into a round hole. In other words, you’ll be using the wrong tool for the job, and your research will become a disjointed mess.

If your research is a mix of both exploratory and confirmatory, but you have a tight word count limit, you may need to consider trimming down the scope a little and focusing on one or the other. One methodology executed well has a far better chance of earning marks than a poorly executed mixed methods approach. So, don’t try to be a hero, unless there is a very strong underpinning logic.


Factor #2: The disciplinary norms

Choosing the right methodology for your research also involves looking at the approaches used by other researchers in the field, and studies with similar research aims and objectives to yours. Oftentimes, within a discipline, there is a common methodological approach (or set of approaches) used in studies. While this doesn’t mean you should follow the herd “just because”, you should at least consider these approaches and evaluate their merit within your context.

A major benefit of reviewing the research methodologies used by similar studies in your field is that you can often piggyback on the data collection techniques that other (more experienced) researchers have developed. For example, if you’re undertaking a quantitative study, you can often find tried and tested survey scales with high Cronbach’s alphas. These are usually included in the appendices of journal articles, so you don’t even have to contact the original authors. By using these, you’ll save a lot of time and ensure that your study stands on the proverbial “shoulders of giants” by using high-quality measurement instruments .
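Since survey scales are typically judged by their Cronbach's alpha, it may help to see how the statistic is actually computed: α = k/(k−1) · (1 − Σ item variances / variance of total scores), where k is the number of items. A minimal sketch, with invented Likert-scale responses for illustration:

```python
from statistics import pvariance

def cronbach_alpha(responses):
    """Cronbach's alpha for rows of respondent scores, one column per scale item."""
    k = len(responses[0])                # number of items in the scale
    items = list(zip(*responses))        # transpose: one tuple of scores per item
    item_variances = sum(pvariance(item) for item in items)
    total_variance = pvariance([sum(row) for row in responses])
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical 5-point Likert responses: 4 respondents x 3 items
responses = [
    [4, 5, 4],
    [2, 3, 2],
    [5, 5, 4],
    [3, 3, 3],
]
print(round(cronbach_alpha(responses), 2))  # → 0.96, i.e. high internal consistency
```

Values above roughly 0.7 are conventionally taken to indicate acceptable reliability, which is why published scales with high alphas are worth reusing.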

Of course, when reviewing existing literature, keep point #1 front of mind. In other words, your methodology needs to align with your research aims, objectives and questions. Don’t fall into the trap of adopting the methodological “norm” of other studies just because it’s popular. Only adopt that which is relevant to your research.

Factor #3: Practicalities

When choosing a research methodology, there will always be a tension between doing what’s theoretically best (i.e., the most scientifically rigorous research design ) and doing what’s practical , given your constraints . This is the nature of doing research and there are always trade-offs, as with anything else.

But what constraints, you ask?

When you’re evaluating your methodological options, you need to consider the following constraints:

  • Data access
  • Time
  • Money
  • Equipment and software
  • Your knowledge and skills

Let’s look at each of these.

Constraint #1: Data access

The first practical constraint you need to consider is your access to data . If you’re going to be undertaking primary research , you need to think critically about the sample of respondents you realistically have access to. For example, if you plan to use in-person interviews , you need to ask yourself how many people you’ll need to interview, whether they’ll be agreeable to being interviewed, where they’re located, and so on.

If you want to undertake a quantitative approach using surveys to collect data, you’ll need to consider how many responses you’ll require to achieve statistically significant results. For many statistical tests, a sample of a few hundred respondents is typically needed to develop convincing conclusions.
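As a rough illustration of where "a few hundred respondents" comes from, the standard formula for the sample size needed to estimate a proportion within a margin of error E is n = z²·p(1−p)/E². The sketch below uses the conventional worst-case assumptions (p = 0.5, 95% confidence), which are not specific to any particular study:

```python
import math

def sample_size_for_proportion(margin_of_error, confidence_z=1.96, p=0.5):
    """Minimum sample size to estimate a proportion within +/- margin_of_error.

    Uses n = z^2 * p * (1 - p) / E^2, with p = 0.5 as the worst case
    (maximum variance) and z = 1.96 for 95% confidence.
    """
    n = (confidence_z ** 2) * p * (1 - p) / margin_of_error ** 2
    return math.ceil(n)

print(sample_size_for_proportion(0.05))  # 5% margin of error → 385 respondents
```

This is only a back-of-envelope heuristic; the sample size a given study actually requires depends on the statistical test, the expected effect size, and the desired power.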

So, think carefully about what data you’ll need access to, how much data you’ll need and how you’ll collect it. The last thing you want is to spend a huge amount of time on your research only to find that you can’t get access to the required data.

Constraint #2: Time

The next constraint is time. If you’re undertaking research as part of a PhD, you may have a fairly open-ended time limit, but this is unlikely to be the case for undergrad and Masters-level projects. So, pay attention to your timeline, as the data collection and analysis components of different methodologies have a major impact on time requirements . Also, keep in mind that these stages of the research often take a lot longer than originally anticipated.

Another practical implication of time limits is that it will directly impact which time horizon you can use – i.e. longitudinal vs cross-sectional . For example, if you’ve got a 6-month limit for your entire research project, it’s quite unlikely that you’ll be able to adopt a longitudinal time horizon. 

Constraint #3: Money

As with so many things, money is another important constraint you’ll need to consider when deciding on your research methodology. While some research designs will cost near zero to execute, others may require a substantial budget .

Some of the costs that may arise include:

  • Software costs – e.g. survey hosting services, analysis software, etc.
  • Promotion costs – e.g. advertising a survey to attract respondents
  • Incentive costs – e.g. providing a prize or cash payment incentive to attract respondents
  • Equipment rental costs – e.g. recording equipment, lab equipment, etc.
  • Travel costs
  • Food & beverages

These are just a handful of costs that can creep into your research budget. Like most projects, the actual costs tend to be higher than the estimates, so be sure to err on the conservative side and expect the unexpected. It’s critically important that you’re honest with yourself about these costs, or you could end up getting stuck midway through your project because you’ve run out of money.

Budgeting for your research

Constraint #4: Equipment & software

Another practical consideration is the hardware and/or software you’ll need in order to undertake your research. Of course, this variable will depend on the type of data you’re collecting and analysing. For example, you may need lab equipment to analyse substances, or you may need specific analysis software to analyse statistical data. So, be sure to think about what hardware and/or software you’ll need for each potential methodological approach, and whether you have access to these.

Constraint #5: Your knowledge and skillset

The final practical constraint is a big one. Naturally, the research process involves a lot of learning and development along the way, so you will accrue knowledge and skills as you progress. However, when considering your methodological options, you should still consider your current position on the ladder.

Some of the questions you should ask yourself are:

  • Am I more of a “numbers person” or a “words person”?
  • How much do I know about the analysis methods I’ll potentially use (e.g. statistical analysis)?
  • How much do I know about the software and/or hardware that I’ll potentially use?
  • How excited am I to learn new research skills and gain new knowledge?
  • How much time do I have to learn the things I need to learn?

Answering these questions honestly will provide you with another set of criteria against which you can evaluate the research methodology options you’ve shortlisted.

So, as you can see, there is a wide range of practicalities and constraints that you need to take into account when you’re deciding on a research methodology. These practicalities create a tension between the “ideal” methodology and the methodology that you can realistically pull off. This is perfectly normal, and it’s your job to find the option that presents the best set of trade-offs.

Recap: Choosing a methodology

In this post, we’ve discussed how to go about choosing a research methodology. The three major deciding factors we looked at were:

  • The nature of your research: exploratory, confirmatory, or a combination of both
  • The methodological norms in your research area
  • Practicalities: data access, time, money, equipment and software, and your knowledge and skillset

If you have any questions, feel free to leave a comment below. If you’d like a helping hand with your research methodology, check out our 1-on-1 research coaching service , or book a free consultation with a friendly Grad Coach.





How To Write A Proposal – Step By Step Guide [With Template]

How To Write A Proposal

Writing a Proposal involves several key steps to effectively communicate your ideas and intentions to a target audience. Here’s a detailed breakdown of each step:

Identify the Purpose and Audience

  • Clearly define the purpose of your proposal: What problem are you addressing, what solution are you proposing, or what goal are you aiming to achieve?
  • Identify your target audience: Who will be reading your proposal? Consider their background, interests, and any specific requirements they may have.

Conduct Research

  • Gather relevant information: Conduct thorough research to support your proposal. This may involve studying existing literature, analyzing data, or conducting surveys/interviews to gather necessary facts and evidence.
  • Understand the context: Familiarize yourself with the current situation or problem you’re addressing. Identify any relevant trends, challenges, or opportunities that may impact your proposal.

Develop an Outline

  • Create a clear and logical structure: Divide your proposal into sections or headings that will guide your readers through the content.
  • Introduction: Provide a concise overview of the problem, its significance, and the proposed solution.
  • Background/Context: Offer relevant background information and context to help the readers understand the situation.
  • Objectives/Goals: Clearly state the objectives or goals of your proposal.
  • Methodology/Approach: Describe the approach or methodology you will use to address the problem.
  • Timeline/Schedule: Present a detailed timeline or schedule outlining the key milestones or activities.
  • Budget/Resources: Specify the financial and other resources required to implement your proposal.
  • Evaluation/Success Metrics: Explain how you will measure the success or effectiveness of your proposal.
  • Conclusion: Summarize the main points and restate the benefits of your proposal.

Write the Proposal

  • Grab attention: Start with a compelling opening statement or a brief story that hooks the reader.
  • Clearly state the problem: Clearly define the problem or issue you are addressing and explain its significance.
  • Present your proposal: Introduce your proposed solution, project, or idea and explain why it is the best approach.
  • State the objectives/goals: Clearly articulate the specific objectives or goals your proposal aims to achieve.
  • Provide supporting information: Present evidence, data, or examples to support your claims and justify your proposal.
  • Explain the methodology: Describe in detail the approach, methods, or strategies you will use to implement your proposal.
  • Address potential concerns: Anticipate and address any potential objections or challenges the readers may have and provide counterarguments or mitigation strategies.
  • Recap the main points: Summarize the key points you’ve discussed in the proposal.
  • Reinforce the benefits: Emphasize the positive outcomes, benefits, or impact your proposal will have.
  • Call to action: Clearly state what action you want the readers to take, such as approving the proposal, providing funding, or collaborating with you.

Review and Revise

  • Proofread: Check for grammar, spelling, and punctuation errors.
  • Ensure a logical flow: Read through your proposal to ensure the ideas are presented in a logical order and are easy to follow.
  • Revise and refine: Fine-tune your proposal to make it concise, persuasive, and compelling.

Add Supplementary Materials

  • Attach relevant documents: Include any supporting materials that strengthen your proposal, such as research findings, charts, graphs, or testimonials.
  • Appendices: Add any additional information that might be useful but not essential to the main body of the proposal.

Formatting and Presentation

  • Follow the guidelines: Adhere to any specific formatting guidelines provided by the organization or institution to which you are submitting the proposal.
  • Use a professional tone and language: Ensure that your proposal is written in a clear, concise, and professional manner.
  • Use headings and subheadings: Organize your proposal with clear headings and subheadings to improve readability.
  • Pay attention to design: Use appropriate fonts, font sizes, and formatting styles to make your proposal visually appealing.
  • Include a cover page: Create a cover page that includes the title of your proposal, your name or organization, the date, and any other required information.

Seek Feedback

  • Share your proposal with trusted colleagues or mentors and ask for their feedback. Consider their suggestions for improvement and incorporate them into your proposal if necessary.

Finalize and Submit

  • Make any final revisions based on the feedback received.
  • Ensure that all required sections, attachments, and documentation are included.
  • Double-check for any formatting, grammar, or spelling errors.
  • Submit your proposal within the designated deadline and according to the submission guidelines provided.

Proposal Format

The format of a proposal can vary depending on the specific requirements of the organization or institution you are submitting it to. However, here is a general proposal format that you can follow:

1. Title Page:

  • Include the title of your proposal, your name or organization’s name, the date, and any other relevant information specified by the guidelines.

2. Executive Summary:

  • Provide a concise overview of your proposal, highlighting the key points and objectives.
  • Summarize the problem, proposed solution, and anticipated benefits.
  • Keep it brief and engaging, as this section is often read first and should capture the reader’s attention.

3. Introduction:

  • State the problem or issue you are addressing and its significance.
  • Provide background information to help the reader understand the context and importance of the problem.
  • Clearly state the purpose and objectives of your proposal.

4. Problem Statement:

  • Describe the problem in detail, highlighting its impact and consequences.
  • Use data, statistics, or examples to support your claims and demonstrate the need for a solution.

5. Proposed Solution or Project Description:

  • Explain your proposed solution or project in a clear and detailed manner.
  • Describe how your solution addresses the problem and why it is the most effective approach.
  • Include information on the methods, strategies, or activities you will undertake to implement your solution.
  • Highlight any unique features, innovations, or advantages of your proposal.

6. Methodology:

  • Provide a step-by-step explanation of the methodology or approach you will use to implement your proposal.
  • Include a timeline or schedule that outlines the key milestones, tasks, and deliverables.
  • Clearly describe the resources, personnel, or expertise required for each phase of the project.

7. Evaluation and Success Metrics:

  • Explain how you will measure the success or effectiveness of your proposal.
  • Identify specific metrics, indicators, or evaluation methods that will be used.
  • Describe how you will track progress, gather feedback, and make adjustments as needed.

8. Budget:

  • Present a detailed budget that outlines the financial resources required for your proposal.
  • Include all relevant costs, such as personnel, materials, equipment, and any other expenses.
  • Provide a justification for each item in the budget.

9. Conclusion:

  • Summarize the main points of your proposal.
  • Reiterate the benefits and positive outcomes of implementing your proposal.
  • Emphasize the value and impact it will have on the organization or community.

10. Appendices:

  • Include any additional supporting materials, such as research findings, charts, graphs, or testimonials.
  • Attach any relevant documents that provide further information but are not essential to the main body of the proposal.

Proposal Template

Here’s a basic proposal template that you can use as a starting point for creating your own proposal:

Dear [Recipient’s Name],

I am writing to submit a proposal for [briefly state the purpose of the proposal and its significance]. This proposal outlines a comprehensive solution to address [describe the problem or issue] and presents an actionable plan to achieve the desired objectives.

Thank you for considering this proposal. I believe that implementing this solution will significantly contribute to [organization’s or community’s goals]. I am available to discuss the proposal in more detail at your convenience. Please feel free to contact me at [your email address or phone number].

Yours sincerely,

[Your Name]

Note: This template is a starting point and should be customized to meet the specific requirements and guidelines provided by the organization or institution to which you are submitting the proposal.

Proposal Sample

Here’s a sample proposal to give you an idea of how it could be structured and written:

Subject: Proposal for the Implementation of an Environmental Education Program

I am pleased to submit this proposal for your consideration, outlining a comprehensive plan for the implementation of an Environmental Education Program. This program aims to address the critical need for environmental awareness and education among the community, with the objective of fostering a sense of responsibility and sustainability.

Executive Summary: Our proposed Environmental Education Program is designed to provide engaging and interactive educational opportunities for individuals of all ages. By combining classroom learning, hands-on activities, and community engagement, we aim to create a long-lasting impact on environmental conservation practices and attitudes.

Introduction: The state of our environment is facing significant challenges, including climate change, habitat loss, and pollution. It is essential to equip individuals with the knowledge and skills to understand these issues and take action. This proposal seeks to bridge the gap in environmental education and inspire a sense of environmental stewardship among the community.

Problem Statement: The lack of environmental education programs has resulted in limited awareness and understanding of environmental issues. As a result, individuals are less likely to adopt sustainable practices or actively contribute to conservation efforts. Our program aims to address this gap and empower individuals to become environmentally conscious and responsible citizens.

Proposed Solution or Project Description: Our Environmental Education Program will comprise a range of activities, including workshops, field trips, and community initiatives. We will collaborate with local schools, community centers, and environmental organizations to ensure broad participation and maximum impact. By incorporating interactive learning experiences, such as nature walks, recycling drives, and eco-craft sessions, we aim to make environmental education engaging and enjoyable.

Methodology: Our program will be structured into modules that cover key environmental themes, such as biodiversity, climate change, waste management, and sustainable living. Each module will include a mix of classroom sessions, hands-on activities, and practical field experiences. We will also leverage technology, such as educational apps and online resources, to enhance learning outcomes.

Evaluation and Success Metrics: We will employ a combination of quantitative and qualitative measures to evaluate the effectiveness of the program. Pre- and post-assessments will gauge knowledge gain, while surveys and feedback forms will assess participant satisfaction and behavior change. We will also track the number of community engagement activities and the adoption of sustainable practices as indicators of success.

Budget: Please find attached a detailed budget breakdown for the implementation of the Environmental Education Program. The budget covers personnel costs, materials and supplies, transportation, and outreach expenses. We have ensured cost-effectiveness while maintaining the quality and impact of the program.

Conclusion: By implementing this Environmental Education Program, we have the opportunity to make a significant difference in our community’s environmental consciousness and practices. We are confident that this program will foster a generation of individuals who are passionate about protecting our environment and taking sustainable actions. We look forward to discussing the proposal further and working together to make a positive impact.

Thank you for your time and consideration. Should you have any questions or require additional information, please do not hesitate to contact me at [your email address or phone number].

About the author


Muhammad Hassan

Researcher, Academic Writer, Web developer


  • Open access
  • Published: 10 January 2024

A scoping review of theories, models and frameworks used or proposed to evaluate knowledge mobilization strategies

  • Saliha Ziam (ORCID: orcid.org/0000-0002-8892-9572) 1,
  • Sèverine Lanoue 2,
  • Esther McSween-Cadieux 2,
  • Mathieu-Joël Gervais 3,
  • Julie Lane 2,4,
  • Dina Gaid 5,
  • Laura Justine Chouinard 1,
  • Christian Dagenais 6,
  • Valéry Ridde 7,8,
  • Emmanuelle Jean 9,
  • France Charles Fleury 10,
  • Quan Nha Hong 5 &
  • Ollivier Prigent 2

Health Research Policy and Systems, volume 22, Article number: 8 (2024)


Abstract

Evaluating knowledge mobilization (KMb) strategies presents challenges for organizations seeking to understand their impact and improve KMb effectiveness. Moreover, the large number of theories, models and frameworks (TMFs) available can be confusing for users. Therefore, the purpose of this scoping review was to identify and describe the characteristics of TMFs that have been used or proposed in the literature to evaluate KMb strategies.

A scoping review methodology was used. Articles were identified through searches in electronic databases, previous reviews and reference lists of included articles. Titles, abstracts and full texts were screened in duplicate. Data were charted using a piloted data charting form. Data extracted included study characteristics, KMb characteristics, and TMFs used or proposed for KMb evaluation. An adapted version of Nilsen's taxonomy (Implement Sci 10:53, 2015) and the Expert Recommendations for Implementing Change (ERIC) taxonomy (Powell et al. in Implement Sci 10:21, 2015) guided data synthesis.

Of the 4763 search results, 505 were retrieved, and 88 articles were eligible for review. These consisted of 40 theoretical articles (45.5%), 44 empirical studies (50.0%) and four protocols (4.5%). The majority were published after 2010 (n = 70, 79.5%) and were health related (n = 71, 80.7%). Half of the studied KMb strategies were implemented in only four countries: Canada, Australia, the United States and the United Kingdom (n = 42, 47.7%). One-third used existing TMFs (n = 28, 31.8%). According to the adapted Nilsen taxonomy, process models (n = 34, 38.6%) and evaluation frameworks (n = 28, 31.8%) were the two most frequent types of TMFs used or proposed to evaluate KMb. According to the ERIC taxonomy, activities to “train and educate stakeholders” (n = 46, 52.3%) were the most common, followed by activities to “develop stakeholder interrelationships” (n = 23, 26.1%). Analysis of the TMFs identified revealed relevant factors of interest for the evaluation of KMb strategies, classified into four dimensions: context, process, effects and impacts.


This scoping review provides an overview of the many KMb TMFs used or proposed. The results provide insight into potential dimensions and components to be considered when assessing KMb strategies.


Contribution to the literature

The evaluation of KMb strategies is a critical dimension of the KMb process that is still poorly documented and warrants researchers’ attention.

Our review identified the most common theories, models and frameworks (TMFs) proposed or used to assess KMb strategies and the main components to consider when evaluating a KMb strategy.

By developing an integrative reference framework, this work contributes to improving organizations’ capacity to evaluate their KMb initiatives.

It is widely recognized that research evidence has the potential to inform, guide, and improve practices, decisions, and policies [1]. Unfortunately, for diverse reasons, the best available evidence is still too seldom taken into account and used [2, 3, 4, 5, 6, 7]. The field of research on knowledge mobilization (KMb) has been growing rapidly since the early 2000s [2, 3, 8, 9, 10, 11]. Its purpose is to better understand how to effectively promote and support evidence use.

Knowledge mobilization is one of many terms and concepts developed over recent decades to describe processes, strategies, and actions to bridge the gap between research and practice. Other common terms often paired interchangeably with the term “knowledge” are “translation”, “transfer”, “exchange”, “sharing” and “dissemination”, among others [12, 13]. Some are more closely linked than others to specific fields or jurisdictions. For this study, we adopted the term knowledge mobilization (KMb) because it conveys the notions of complexity and multidirectional exchanges that characterize research-to-action processes. We used it as an umbrella concept that encompasses the efforts made to translate knowledge into concrete actions and beneficial impacts on populations [1]. Moreover, the term KMb is also used by research funding agencies in Canada to emphasize the medium- and long-term effects that research knowledge or research results can have on potential users [1, 14].

KMb represents all processes from knowledge creation to action and includes all strategies implemented to facilitate these processes [14]. A KMb strategy is understood as a coordinated set of activities to support evidence use, such as dissemination activities to reach target audiences (for example, educational materials, practical guides, decision support tools) or activities to facilitate knowledge application in a specific context and support professional behaviour change (for example, communities of practice, educational meetings, audits and feedback, reminders, deliberative dialogues) [15]. A KMb process may vary in intensity, complexity or actor engagement depending on the nature of the research knowledge and the needs and preferences of evidence users [7].

KMb is considered a complex process, in that numerous factors can facilitate or hinder its implementation and subsequent evidence use. The past two decades have seen the emergence of a deeper understanding of these factors [2, 3, 16]. These may be related to the knowledge mobilized (for example, relevance, reliability, clarity, costs), the individuals involved in the KMb process (for example, openness to change, values, time available, resources), the KMb strategies (for example, fit with stakeholder needs and preferences, regular interactions, trust relationships, timing), and organizational and political contexts (for example, culture of evidence use, leadership, resources) [2, 6, 17, 18]. However, more studies are needed to understand which factors are more important in which contexts, and to evaluate the effects of KMb strategies.

On this last point, although empirically studying KMb impacts to demonstrate the effectiveness of KMb strategies is essential, it is often very complex [19, 20, 21]. Partly for this reason, high-quality studies that evaluate the processes, mechanisms and effects of KMb strategies are still relatively rare [2, 22, 23, 24, 25]. As a result, knowledge about the effectiveness of different KMb strategies remains limited [10, 17, 19, 23, 26, 27, 28] and their development cannot be totally evidence informed [3, 19, 20, 23, 29, 30], which may seem incompatible with the core values and principles of KMb.

The growing interest in KMb has led to an impressive proliferation of conceptual propositions, such as theories, models and frameworks (TMFs) [2, 3, 9, 11, 12, 31, 32]. Many deplore the fact that these are poorly used [11, 30, 33] and insist on the need to test, refine and integrate existing ones [3, 31, 34]. Indeed, the conceptual and theoretical development of the field has outpaced its empirical development. This proliferation appears to have created confusion among certain users, such as organizations that need to evaluate their KMb strategies. Besides implementing and funding KMb strategies, knowledge organizations such as granting agencies, governments and public organizations, universities and health authorities are often required to demonstrate the impact of their strategies [21, 35, 36]. Yet this can be a significant challenge [20, 23, 29]. They may have difficulty knowing which TMFs to choose, in what context, and how to use them effectively in their evaluation process [12, 37].

Indeed, the evaluation of KMb strategies is still relatively poorly documented compared with the phases of their development and implementation. Our aim in this scoping review is to clarify, conceptually and methodologically, this crucial dimension of the KMb process. Doing so would help organizations gain access to evidence-based, operational and easy-to-use evaluation toolkits for assessing the impacts of their KMb strategies.

To survey the available knowledge on evaluation practices for KMb strategies, we conducted a scoping review. According to Munn et al. [38], a scoping review is indicated to identify the types of available evidence and knowledge gaps, to clarify concepts in the literature and to identify key characteristics or factors related to a concept. This review methodology also allows for the inclusion of a diversity of publications, regardless of their nature or research design, to produce the most comprehensive evidence mapping possible [39]. The objective of the scoping review was to identify and describe the characteristics of theories, models and frameworks (TMFs) used or proposed to evaluate KMb strategies. The specific research questions were:

What TMFs to evaluate KMb strategies exist in the literature?

What KMb strategies do they evaluate (that is, types of KMb objectives, activities, target audiences)?

What dimensions and components are included in these TMFs?

This scoping review was conducted based on the five steps outlined by Arksey and O’Malley [39]: (1) formulating the research questions; (2) identifying relevant studies; (3) selecting relevant studies; (4) extracting and charting data; and (5) analysing, collating, summarizing and presenting the data. Throughout the process, researchers and knowledge users (KMb practitioners) were involved in decisions regarding the research question, search strategy, selection criteria for studies and categories for data charting. We followed the Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) guidelines [40]. No protocol was registered for this review.

Search strategy and information sources

The search strategy was developed, piloted and refined in consultation with our team’s librarian. Search terms included controlled vocabulary and keywords related to three main concepts: (1) knowledge mobilization (for example, [knowledge or evidence or research] and transfer, translation, diffusion, dissemination, mobilization, implementation science, exchange, sharing, use, uptake, evidence-based practice, research-based evidence), (2) evaluation (for example, evaluat*, measur*, impact, outcome, assess, apprais*, indicator) and (3) TMF (for example, framework*, model*, method*, guide*, theor*). See Additional file 1 for the search terms and strategies used in the electronic searches.

The following databases were searched from January 2000 to August 2023: MEDLINE (Ovid), PsycInfo (Ovid), ERIC (ProQuest), Sociological Abstracts (ProQuest), Dissertations & Theses (ProQuest), Érudit and Cairn. These databases were chosen to identify relevant references in the health, education and social fields. Several search strategies were tested by the librarian to optimize the retrieval of citations known to the investigators and to increase the likelihood that all relevant studies would be retrieved. We also searched the reference lists of included articles and previous systematic reviews [11, 12, 15, 41].
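As a rough illustration of how such a block-structured strategy is assembled (terms OR-ed within each concept, the three concept blocks AND-ed together), here is a minimal sketch. The term lists are a small illustrative subset, not the authors' actual strategy, which is given in Additional file 1:

```python
# Minimal sketch of the three-block search logic described above.
# The term lists are an illustrative subset, not the review's full strategy.
def or_block(terms):
    """Join one concept's terms with OR and parenthesize the block."""
    return "(" + " OR ".join(terms) + ")"

kmb = ["knowledge translation", "knowledge mobilization", "evidence use"]
evaluation = ["evaluat*", "measur*", "impact", "outcome*"]
tmf = ["framework*", "model*", "theor*"]

# The final query ANDs the three OR-blocks, mirroring the review's design.
query = " AND ".join(or_block(block) for block in (kmb, evaluation, tmf))
print(query)
```

Keeping each concept in its own OR-block is what lets a single AND-combined query stay both sensitive (broad synonyms within a block) and specific (all three concepts must co-occur).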

Eligibility criteria

A publication was considered eligible if it (1) presented or used a theory, model, or framework (TMF), (2) described dimensions or specific components to consider in the evaluation of KMb strategies, (3) presented or discussed KMb strategies or activities (any initiatives to improve evidence use), and (4) proposed outcomes that might result directly or indirectly from the KMb strategies. Studies were excluded from analysis if they (1) presented a TMF to assess the impact of research without mentioning KMb strategies or an intervention not related to KMb and (2) presented evaluation dimensions or components that could not be generalized. We considered publications in English or French. All types of articles and study designs were eligible, including study protocols.

Study selection

The results of the literature search were imported into Covidence, which the review team used for screening. After duplicate articles were removed, the titles and abstracts were screened independently by two of the three reviewers (EMC, MJG, GL). Publications identified as potentially relevant were retrieved in full text and screened independently by three reviewers (EMC, MJG, GL). Discrepancies regarding the inclusion of any publication were resolved through discussion and consensus among reviewers. The principal investigator (SZ) validated the final selection of articles.

Data synthesis

A data charting form was developed in Microsoft Excel and piloted by the research team. Data extracted included study characteristics (authors, authors’ country of affiliation, year, journal, discipline, article type, study setting, study aim), KMb strategies of interest, KMb objectives, KMb target audiences and TMFs used or proposed for KMb evaluation (existing or new TMF, specific dimensions or components of TMF and so on). Data were extracted by a single reviewer (SL, JC or OP) and validated by a second reviewer (SZ). Disagreements were discussed between reviewers and resolved by consensus. No quality appraisal of included studies was conducted, as this is optional in scoping reviews and the purpose was only to describe the content of identified TMFs [42].

Data analysis and presentation of results

Data were summarized according to study characteristics, KMb strategy characteristics (activities, objectives, target audiences), types of TMFs, and dimensions or components to consider for KMb evaluation. Disagreements during the process were discussed and resolved through consensus (SL, DG, SZ). A KMb strategy might have one or more objectives and include one or more activities. Thus, the objectives and activities of the KMb strategies extracted from the selected studies were summarized based on existing categorizations. The categorization of KMb objectives was inspired by Gervais et al. [15] and Farkas et al. [43] (Table 1).

The KMb activities were categorized according to the Expert Recommendations for Implementing Change (ERIC) taxonomy [44]. The activities were first classified according to the full taxonomy and then grouped into the nine categories proposed by Waltz et al. [45] (Table 2).

The TMFs were categorized according to the categories of theoretical approaches described by Nilsen [32]: process models, evaluation frameworks, determinant frameworks and classic theories (Table 3). The category “implementation theories” originally described by Nilsen [32] was not used because we did not identify any article that fit this category. We also added a category named “logic models” due to the nature of the identified TMFs. Logic models are often used in theory-driven evaluation approaches and are usually developed to show the links among inputs (resources), activities and outputs (outcomes and short-, medium- and long-term effects) [46].

Finally, the content extracted from the TMFs was analysed using a mainly inductive method. Among other things, this method makes it possible to develop a reference framework or model from the categories that emerge from the text data [50].

The classification of concepts is the result of multiple readings and interpretations. The concepts associated with each dimension of the framework were classified according to their meaning. Similar concepts were grouped together to form components. These grouped components were then associated with the subdimensions and main dimensions of the framework.

Search results

The searches yielded 4763 articles. Of those, 4258 were excluded during title and abstract screening. Of the remaining 505 full-text articles, we retained 88 in our final sample. The results of the search and selection processes (PRISMA flowchart) are summarized in Fig. 1.

Fig. 1 PRISMA flowchart summarizing search strategy and selection results [40]
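The attrition through the screening stages reported above is simple arithmetic; as a quick check (figures taken directly from the text):

```python
# Screening attrition figures reported in the review.
identified = 4763        # records retrieved by the database searches
excluded_ti_ab = 4258    # excluded at title/abstract screening
full_text = identified - excluded_ti_ab  # articles assessed in full text
included = 88            # final sample

assert full_text == 505
# Share of full-text articles retained in the final sample:
print(f"{included / full_text:.1%}")
```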

Publication characteristics

Most articles were published after 2010 (n = 70, 79.5%), with an average of 5 articles per year between 2010 and 2023 compared with an average of 2.1 articles per year between 2001 and 2009; there were no eligible articles from 2000. Because the search was conducted in August 2023, only five of the eligible articles had been published in the first 7 months of that year. Table 4 presents the main characteristics of the selected articles. A full list of the included articles with their main characteristics is presented in Additional file 2.

The number of theoretical and empirical articles was relatively similar. Among the theoretical articles, 19 descriptive articles (21.6%) were aimed at describing a KMb strategy, a KMb infrastructure or a TMF related to a specific programme or context; 18 articles (20.5%) synthesized knowledge to propose a TMF (new or revised); and three articles conducted systematic reviews (3.4%).

The empirical articles category included studies with different methodological approaches (quantitative, qualitative, mixed methods). We will not report the details of the methodologies used, as this would result in a long list with few occurrences. The empirical articles can be divided into three categories: (1) studies that evaluated a TMF related to KMb (n = 16, 18.2%), (2) studies that evaluated a KMb strategy (n = 21, 23.9%) and (3) studies that evaluated both a KMb strategy and a TMF (n = 7, 8.0%).

Most articles were related to healthcare (n = 71, 80.7%). This field of study was divided into three subdomains. The healthcare and social services articles usually described or assessed a KMb strategy targeting health professionals’ practices in a variety of fields (for example, occupational therapy, dentistry, mental health, pharmacology, gerontology, nursing and so on). The health policy and systems articles usually described or assessed KMb strategies targeting decision-making processes, decision-makers or public health interventions and policies. The continuing education articles assessed training programmes for health professionals aimed at increasing knowledge and skills in a specific field. The articles in the general field described or discussed TMFs and KMb strategies that could be applied to multiple disciplines or contexts. Finally, the articles in the education field described or assessed a KMb strategy targeting education professionals.

Almost half of the articles (n = 42, 47.7%) studied KMb strategies implemented in only four countries: Canada, Australia, the United States and the United Kingdom. Countries in South America, the Caribbean, Africa, Asia, the Middle East, China and Europe were underrepresented (n = 8, 9.1%). The remaining 34 articles (38.6%) did not specify an implementation context and were mostly theoretical articles. Regarding the authors’ countries of affiliation, Canada, the United States, Australia and the United Kingdom were again the most represented countries, featuring in 85% of the articles (n = 75).

What theories, models or frameworks exist in the literature to evaluate KMb strategies?

Several articles proposed a new TMF (n = 37, 42.0%), and some articles proposed a logic model specifically developed to evaluate their KMb strategy (n = 17, 19.3%). One-third of the articles used existing TMFs (n = 28, 31.8%). A few articles only referred to existing TMFs but did not use them to guide a KMb strategy evaluation (n = 6, 8.5%).

The identified TMFs were then categorized according to their theoretical approaches (adapted from Nilsen [32]) (Table 5). Five articles used or proposed more than one TMF, and three TMFs could be classified in two categories. Several articles proposed or used a process model (n = 34, 38.6%) or an evaluation framework (n = 28, 31.8%); these were the two most frequently identified types of TMFs. Fewer articles proposed or used a logic model (n = 17, 19.3%), a determinant framework (n = 12, 13.6%) or a classic theory (n = 7, 8.0%). The TMFs most often identified in the articles were the RE-AIM framework (n = 5, 5.7%), the Knowledge-to-Action framework [9] (n = 4, 4.5%), the Theory of Planned Behavior [51] (n = 3, 3.4%) and the Expanded Outcomes framework for planning and assessing continuing medical education [52] (n = 3, 3.4%). In total, we identified 87 different TMFs in the 88 articles. Only nine TMFs were retrieved in more than one article.

What KMb strategies do the TMFs evaluate (activities, objectives, target audience)?

Thirty-eight articles reported using more than one activity in their KMb strategy. According to the ERIC compilation, “Train and educate stakeholders” activities were the most common, followed by “Develop stakeholder interrelationships” and “Use evaluative and iterative strategies”. Table 6 presents the various types of activities and the number of articles that referred to each.

Of the 88 articles analysed, 18 (20.4%) did not specify a KMb objective. The remaining articles proposed one or more KMb strategy objectives. Specifically, 39 (36.4%) articles had one objective, 15 (17.0%) had two, three (3.4%) had three, and 13 (14.8%) had four or five. Table 7 presents the different types of objectives and the number of times they were identified.

The target audiences for KMb strategies were clearly specified in half of the articles (n = 44, 50.0%). Generally, these were empirical articles that targeted specific professionals (n = 36, 40.9%) or decision-makers (n = 8, 9.1%). Just under one-third of the articles identified a broad target audience (for example, professionals and managers in the health system, or a health organization) (n = 26, 29.5%). Finally, 18 articles (20.4%) did not specify a target audience for KMb; these were most often theoretical articles.

What are the dimensions and components included in TMFs for evaluating KMb strategies?

The analysis of the identified TMFs revealed many factors of interest relevant to the evaluation of KMb strategies. These specific components were inductively classified into four main dimensions: context, process, effects and impacts (Fig. 2). The context dimension refers to the assessment of the conditions in place when the KMb strategy is implemented. These include both the external environment (that is, sociopolitical, economic, environmental and cultural characteristics) and the internal environment (that is, characteristics of organizations, individuals and stakeholder partnerships). These factors are understood to influence the selection and tailoring of a KMb strategy. The process dimension refers to the assessment of the planning, levels and mechanisms of implementation, as well as of the characteristics of the KMb strategy implemented. The effects dimension refers to the assessment of outcomes following the implementation of the KMb strategy. The potential effects vary depending on the strategy's objectives and can be either the immediate results of the KMb strategy or short-, medium- and long-term outcomes. The conceptual gradation of effects was generally represented in a similar way in the TMFs analysed, but the temporality of effects could vary: a medium-term outcome in one study could be understood as a long-term outcome in another. However, the majority of authors group these effects into three categories (Gervais et al. 2016: p. 6): (1) short-term effects, measured by the success of the KMb strategy (number of people reached, satisfaction, participation and so on); (2) medium-term effects, linked to changes in individual attitudes and the use of knowledge; and (3) long-term effects, which result from achieving the KMb objective (for example, improved practices and services, changed collective behaviour, sustainable use of knowledge).

Fig. 2 The main evaluation dimensions that emerged from the TMFs analysed

Finally, the impacts dimension refers to the ultimate effects of KMb products or interventions on end users, as measured by the organization (Phipps et al. [36], p. 34). The evaluation of these ultimate effects can be measured by the integration of a promising practice into organizational routines, by the effects on service users or by the effects on the health and well-being of communities and society in general.

This gradation shows the importance of measuring effects at different points in time, to account for the time they take to appear and for their evolving nature (Gervais et al., 2016: p. 6).

Most of the articles only presented the dimensions that should be evaluated, whereas the empirical articles both presented the dimensions and applied them in practice to evaluate a KMb strategy. Only five articles (5.7%) did not mention specific dimensions that could be classified.

Table 8 presents both the number of articles that presented dimensions to be evaluated and the number of articles that evaluated them in practice. These results showed that the effects dimension was both the most often named and the most evaluated in practice. The other three dimensions (context, process, impacts), while quite often mentioned as relevant to assess, were less often evaluated in practice. For example, only five articles (5.7%) reported having assessed the impacts dimension.

As previously mentioned, the components relevant for the evaluation of KMb strategies were extracted from the identified TMFs. Table 9 presents these components, which represent the more specific factors of interest for assessing context, process, effects and impacts.

Although often overlooked, the evaluation of KMb strategies is an essential step in guiding organizations seeking to determine whether the expected outcomes of their initiatives are being realized. Evaluation not only allows organizations to make adjustments if the initiatives are not producing the expected results, but also helps them to justify their funding of such initiatives. Evaluation is also essential if the KMb science is to truly inform KMb practice, such that the strategies developed are based on empirical data [ 30 ]. To make KMb evaluation more feasible, evaluation must be promoted and practices improved.

This scoping review meets the first objective of our project, which was to provide an overview of the reference frameworks used or proposed for evaluating KMb strategies and to propose a preliminary version of such a framework. Several key findings emerged from this scoping review:

Proliferation of theories, models and frameworks, but few frequently used

We are seeing a proliferation of TMFs in KMb and closely related fields [132, 133]. The results of this scoping review thus support the argument that the conceptual and theoretical development of the field is outpacing its empirical development. The largest share of reviewed articles (42.0%) proposed a new TMF rather than using existing ones, and only half of the articles (50.0%) were empirical studies focused on the evaluation of KMb strategies. Consequently, the TMFs used were poorly consolidated, which does not provide a solid empirical foundation to guide the evaluation of KMb strategies. Moreover, not all the TMFs proposed in the articles were specifically developed for evaluation; some focused on KMb implementation processes. These may still provide elements to consider for evaluation, although they were not designed to propose specific indicators.

A scoping review published in 2018 identified 596 studies using 159 different KMb TMFs, 95 of which had been used only once [11]. Many authors reported that these TMFs are rarely reused and validated [11, 30, 33] and that it is important to test, refine and integrate existing ones [3, 31, 34, 133]. A clear, collective and consistent use of existing TMFs is recommended and necessary to advance KMb science and closely related fields [12, 31]. The systematic review by Strifler et al. [11] highlights the diversity of available TMFs and the difficulty users may experience when choosing TMFs to guide their KMb initiatives or evaluation process. Future work should focus on the development of tools to better support users of TMFs, especially those working in organizations. By consolidating a large number of TMFs, the results of this scoping review contribute to these efforts.

The importance of improving evaluation practices for complex multifaceted KMb strategies

Another noteworthy finding was the emphasis on the evaluation of strategies focused on education and professional training for practice improvement (52.3%). Relatively few of the reviewed articles looked at, for example, the evaluation of KMb strategies aimed at informing or influencing decision-making (13.6%) or KMb strategies targeting decision-makers (9.1%). These results reaffirm the importance of conducting more large-scale evaluations of complex and multifaceted KMb strategies, which involve a greater degree of interaction and engagement, are composed of networks of multiple actors, mobilize diverse sources of knowledge and have simultaneous multilevel objectives [19, 134].

The fact that some KMb strategies are complex interventions implemented in complex contexts [134] presents a significant and recurring challenge to their evaluation. Methodological designs, approaches and tools are often ill-suited to capturing the short-, medium- and long-term outcomes of KMb strategies and to identifying the mechanisms by which these outcomes were produced in a specific context. It is also difficult to link concrete changes in practice and decision-making to tangible longer-term impacts at the population level. Moreover, these impacts can take years to be achieved [36] and can be influenced by several factors other than KMb efforts [2, 19, 24]. Comprehensive, dynamic and flexible evaluation approaches [135, 136, 137] using mixed methods [20] appear necessary to understand why, for whom, how, when and in what context KMb strategies achieve their objectives [2, 21, 25]. For instance, realist evaluation, a form of theory-based evaluation, may be an approach that addresses issues of causality without sacrificing complexity [134, 138, 139]. This approach aims to identify the underlying generative mechanisms that can explain how the outcomes were generated and which characteristics of the context affected, or did not affect, those mechanisms. It is used to test and refine theory about how interventions with a similar logic of action actually work [139].

Large heterogeneity of methodologies used in empirical studies

Despite the growth of the KMb field, a recurring issue is the relatively limited number of high-quality studies that evaluate KMb outcomes and impacts, an observation shared by many of the authors of our scoping articles [2, 22, 23, 24, 25]. Only a limited number of empirical articles met the selection criteria (n = 44/88) in this scoping review. Synthesizing these studies is challenging because of the diversity of research designs used and the large number of potential evaluation components identified. In addition, most of the identified studies used TMFs and measurement tools that were not validated [20, 29] and that were specifically developed for their study [16, 25, 140]. Moreover, these studies did not describe the methods used to justify their choice of evaluation dimensions and components [25], which greatly hinders the ability to draw inferences and develop generalizable theories through replication in similar studies [110, 140, 141, 142, 143]. The lack of a widely used evaluation approach across the field is therefore an important issue [16, 20], also highlighted by this scoping review.

Our aim in this review was not to identify specific indicators or measurement tools (for example, questionnaires) for assessing KMb strategies, but rather to describe the dimensions and components of TMFs used for KMb evaluation. However, a recent scoping review [144] that did look at measurement tools identified only two general-purpose tools for assessing KMb activities in any sector or organization: the Level of Knowledge Use Survey (LOKUS) [145] and the Knowledge Uptake and Utilization Tool (KUUT) [95]. The authors also assert the importance of developing standardized tools and evaluation processes to facilitate comparison of the outcomes of KMb activities across organizations [144].

Lack of description and reporting of KMb strategies and evaluation

Another important finding from this review was the sparsity of descriptions of KMb strategies in the published articles. In general, the authors provided little information on the operationalization of their KMb strategies (for example, objectives, target audiences, details of activities implemented, implementation context, expected effects). The KMb strategy objectives and the implemented activities should be carefully selected and empirically, theoretically or pragmatically justified before the evaluation components and specific indicators are determined [146].

To improve consistency in the field and to contribute to the development of KMb science, many authors reported the need to better describe and report KMb strategies and their context [8, 54, 146, 147, 148, 149, 150]. KMb strategies are often inconsistently labelled across studies, poorly described and rarely justified theoretically [146, 150, 151]. It was not possible in this scoping review to associate the evaluation components to be used with the objectives and types of KMb strategies, as too much information was missing in the articles. Over the past 10 years, several guidelines have been proposed to improve the reporting of interventions such as KMb strategies: the “Workgroup for Intervention Development and Evaluation Research (WIDER) recommendations checklist” [147], the “Standards for Reporting Implementation Studies (StaRI)” [150] and the “Template for Intervention Description and Replication (TIDieR)” [152]. These guidelines should be used more often to enhance the reporting of KMb strategies and help advance the field [153].

Implications for future research

This scoping review provides an overview of potential factors of interest for assessing the context, process, effects and impacts of a KMb strategy. It also proposes a preliminary inventory of potential dimensions and components to consider when planning the evaluation of a KMb strategy. Given the broad spectrum of factors of interest identified across studies, not all of them can be assessed in every context. Rather, they should be targeted according to the objectives of the evaluation, the nature of the KMb strategy and the resources available to conduct the evaluation. Thus, this inventory should not be understood as a prescriptive, normative and exhaustive framework, but rather as a toolbox for identifying the most relevant factors to include in the evaluation of a given KMb strategy, and for addressing a need often expressed by organizations wishing to evaluate their KMb efforts.

Additional work is needed to validate and operationalize these dimensions, to identify relevant measurement tools related to the different components and to see how this inventory could support KMb evaluation practices in organizations.

This scoping review is the first stage of a larger research project aimed at improving organizations’ capacity to evaluate their KMb initiatives by developing an integrative, interdisciplinary and easy-to-use reference framework. In the second phase of the project, the relevance and clarity of the evaluation dimensions identified in the scoping review will be validated through a Delphi study with KMb specialists and researchers. In the third phase, the enriched framework will be pilot tested in two organizations carrying out and evaluating KMb strategies, to adapt it to their needs and to further clarify how the dimensions can be measured in practice; guidance will also be provided to help organizations adopt the framework and its support kit. The aim of the project is to go beyond proposing a theoretical framework and to help build organizations’ capacity to evaluate KMb strategies by proposing tools adapted to their realities.

Review limitations

Some limitations of this scoping review should be acknowledged. First, given the numerous different terms used to describe and conceptualize the science of using evidence, it is possible that our search strategy did not capture all relevant publications. To limit this risk, however, we manually searched the reference lists of the selected articles. Second, the literature search was limited to articles published in English or French, and the articles were mostly from high-income countries (for example, in North America); the application of the concepts identified in this scoping review to other contexts should therefore be further explored.

In addition, the search strategy focused on scientific publications to assess progress made in the field of knowledge mobilization strategy evaluation; the grey literature was not examined. Future research should consider grey literature to complete the overview of evaluation needs in the field.

Finally, the paucity of information in the articles sometimes made it difficult to classify the TMFs according to the taxonomies [32, 44], which may have led to possible misinterpretation. However, to limit the risk of errors, the categorization was performed by two reviewers and validated by a third in cases of uncertainty.

Given the increasing demand from organizations for the evaluation of KMb strategies, along with the poorly consolidated KMb research field, a scoping review was needed to identify the range, nature and extent of the literature. This scoping review enabled us to synthesize the breadth of the literature, provide an overview of the many theories, models and frameworks used, and identify and categorize the potential dimensions and components to consider when evaluating KMb initiatives. This scoping review is part of a larger research project, in which the next steps will be to validate the integrative framework and develop a support kit to facilitate its use by organizations involved in KMb.

Availability of data and materials

The dataset supporting the conclusions of this article is included within the article and its additional files.


  • Knowledge mobilization
  • Theories, models, and frameworks

Social Sciences and Humanities Research Council. Guidelines for Effective Knowledge Mobilization. 2019. https://www.sshrc-crsh.gc.ca/funding-financement/policies-politiques/knowledge_mobilisation-mobilisation_des_connaissances-eng.aspx. Accessed 28 Dec 2022.

Boaz A, Davies H, Fraser A, Nutley S. What works now? evidence-informed policy and practice. Bristol: Policy press; 2019.

Curran JA, Grimshaw JM, Hayden JA, Campbell B. Knowledge translation research: the science of moving research into policy and practice. J Contin Educ Health Prof. 2011;31(3):174–80.

Global Commission on Evidence. The Evidence Commission report: a wake-up call and path forward for decision-makers, evidence intermediaries, and impact-oriented evidence producers. McMaster University; 2022. p. 144. https://www.mcmasterforum.org/networks/evidence-commission/report/english

Morris ZS, Wooding S, Grant J. The answer is 17 years, what is the question: understanding time lags in translational research. J R Soc Med. 2011;104(12):510–20.

Orton L, Lloyd-Williams F, Taylor-Robinson D, O’Flaherty M, Capewell S. The use of research evidence in public health decision making processes: systematic review. PLoS ONE. 2011;6(7): e21704.

Straus SE, Tetroe J, Graham ID, editors. Knowledge translation in health care: moving from evidence to practice. 2nd ed. Chichester/Hoboken: Wiley/BMJ Books; 2013. 406 p.

Barwick M, Dubrowski R, Petricca K. Knowledge translation: The rise of implementation. 2020; https://ktdrr.org/products/kt-implementation/KT-Implementation-508.pdf

Graham ID, Logan J, Harrison MB, Straus SE, Tetroe J, Caswell W, et al. Lost in knowledge translation: time for a map? J Contin Educ Health Prof. 2006;26(1):13–24.

Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE. Knowledge translation of research findings. Implement Sci. 2012;7(1):50.

Strifler L, Cardoso R, McGowan J, Cogo E, Nincic V, Khan PA, et al. Scoping review identifies significant number of knowledge translation theories, models, and frameworks with limited use. J Clin Epidemiol. 2018;100:92–102.

Esmail R, Hanson HM, Holroyd-Leduc J, Brown S, Strifler L, Straus SE, et al. A scoping review of full-spectrum knowledge translation theories, models, and frameworks. Implement Sci. 2020;15(1):11.

McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, et al. A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of health literature in 2006: a Tower of Babel? Implement Sci. 2010;5(1):16.

Fonds de recherche du Québec. Stratégie de mobilisation des connaissances 2014–2017. 2014. https://frq.gouv.qc.ca/en/mobilization-of-knowledge/. Accessed 28 Dec 2022.

Gervais MJ, Souffez K, Ziam S. Quel impact avons-nous ? Vers l’élaboration d’un cadre pour rendre visibles les retombées du transfert des connaissances. TUC Revue francophone de recherche sur le transfert et l’utilisation des connaissances. 2016;1(2):21.

Williams NJ, Beidas RS. Annual research review: the state of implementation science in child psychology and psychiatry: a review and suggestions to advance the field. J Child Psychol Psychiatr. 2019;60(4):430–50.

Mitton C, Adair CE, Mckenzie E, Patten SB, Perry BW. Knowledge transfer and exchange: review and synthesis of the literature. Milbank Q. 2007;85(4):729–68.

Oliver K, Innvar S, Lorenc T, Woodman J, Thomas J. A systematic review of barriers to and facilitators of the use of evidence by policymakers. BMC Health Serv Res. 2014;14(1):2.

Fazey I, Bunse L, Msika J, Pinke M, Preedy K, Evely AC, et al. Evaluating knowledge exchange in interdisciplinary and multi-stakeholder research. Glob Environ Chang. 2014;25:204–20.

Gervais MJ, Marion C, Dagenais C, Chiocchio F, Houlfort N. Dealing with the complexity of evaluating knowledge transfer strategies: guiding principles for developing valid instruments. Res Eval. 2016;25(1):62–9.

Reed MS, Bryce R, Machen R. Pathways to policy impact: a new approach for planning and evidencing research impact. Evid policy. 2018;14(3):431–58.

Kim C, Wilcher R, Petruney T, Krueger K, Wynne L, Zan T. A research utilisation framework for informing global health and development policies and programmes. Health Res Policy Sys. 2018;16(1):9.

Langer L, Tripney J, Gough D. The science of using science: researching the use of research evidence in decision-making. London: University of London, Social Science Research Unit, Evidence for Policy and Practice Information and Co-ordinating Centre; 2016.

Rajić A, Young I, McEwen SA. Improving the utilization of research knowledge in agri-food public health: a mixed-method review of knowledge translation and transfer. Foodborne Pathog Dis. 2013;10(5):397–412.

Scarlett J, Forsberg BC, Biermann O, Kuchenmüller T, El-Khatib Z. Indicators to evaluate organisational knowledge brokers: a scoping review. Health Res Policy Syst. 2020;18(1):93.

Bornbaum CC, Kornas K, Peirson L, Rosella LC. Exploring the function and effectiveness of knowledge brokers as facilitators of knowledge translation in health-related settings: a systematic review and thematic analysis. Implement Sci. 2015;10(1):162.

Sarkies MN, Bowles KA, Skinner EH, Haas R, Lane H, Haines TP. The effectiveness of research implementation strategies for promoting evidence-informed policy and management decisions in healthcare: a systematic review. Implement Sci. 2017;12(1):132.

Scott SD, Albrecht L, O’Leary K, Ball GD, Hartling L, Hofmeyer A, et al. Systematic review of knowledge translation strategies in the allied health professions. Implement Sci. 2012;7(1):70.

Dagenais C, Malo M, Robert É, Ouimet M, Berthelette D, Ridde V. Knowledge transfer on complex social interventions in public health: a scoping study. PLoS ONE. 2013;8(12): e80233.

Davies HT, Powell AE, Nutley SM. Mobilising knowledge to improve UK health care: learning from other countries and other sectors – a multimethod mapping study. Health Serv Deliv Res. 2015;3(27):1–190.

Damschroder LJ. Clarity out of chaos: use of theory in implementation research. Psychiatry Res. 2020;283: 112461.

Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10(1):53.

Ellen ME, Panisset U, Araujo de Carvalho I, Goodwin J, Beard J. A knowledge translation framework on ageing and health. Health Policy. 2017;121(3):282–91.

Wensing M, Bosch M, Grol R. Developing and selecting interventions for translating knowledge to action. CMAJ. 2010;182(2):E85–8.

Bennet A, Bennet D, Fafard K, Fonda M, Lomond T, Messier L, et al. Knowledge mobilization in the social sciences and humanities: moving from research to action. Frost: MQI Press; 2007.

Phipps D, Cummins J, Pepler D, Craig W, Cardinal S. The co-produced pathway to impact describes knowledge mobilization processes. JCES. 2016;9(1). https://jces.ua.edu/articles/258. Accessed 17 Nov 2022.

Birken SA, Rohweder CL, Powell BJ, Shea CM, Scott J, Leeman J, et al. T-CaST: an implementation theory comparison and selection tool. Implement Sci. 2018;13(1):143.

Munn Z, Peters MDJ, Stern C, Tufanaru C, McArthur A, Aromataris E. Systematic review or scoping review? Guidance for authors when choosing between a systematic or scoping review approach. BMC Med Res Methodol. 2018;18(1):143.

Arksey H, O’Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol. 2005;8(1):19–32.

Tricco AC, Lillie E, Zarin W, O’Brien KK, Colquhoun H, Levac D, et al. PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med. 2018;169(7):467–73.

Moullin JC, Sabater-Hernandez D, Fernandez-Llimos F, Benrimoj SI. A systematic review of implementation frameworks of innovations in healthcare and resulting generic implementation framework. Health Res Policy Syst. 2015;13:16.

Pham MT, Rajić A, Greig JD, Sargeant JM, Papadopoulos A, McEwen SA. A scoping review of scoping reviews: advancing the approach and enhancing the consistency. Res Synthesis Methods. 2014;5(4):371–85.

Farkas M, Jette AM, Tennstedt S, Haley SM, Quinn V. Knowledge dissemination and utilization in gerontology: an organizing framework. Gerontologist. 2003;43:47–56.

Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(1):21.

Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implement Sci. 2015;10(1):109.

Smith JD, Li DH, Rafferty MR. The implementation research logic model: a method for planning, executing, reporting, and synthesizing implementation projects. Implement Sci. 2020;15(1):84.

Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–7.

Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: a conceptual framework. Qual Health Care. 1998;7:149–58.

Sketris IS, Carter N, Traynor RL, Watts D, Kelly K, and contributing members of the CNODES Knowledge Translation Team. Building a framework for the evaluation of knowledge translation for the Canadian Network for Observational Drug Effect Studies. Pharmacoepidemiol Drug Saf. 2020;29(Suppl 1):8–25.

Thomas DR. A general inductive approach for analyzing qualitative evaluation data. Am J Eval. 2006;27(2):237–46.

Ajzen I. The theory of planned behavior. Organ Behav Hum Decis Process. 1991;50(2):179–211.

Moore DE, Green JS, Gallis HA. Achieving desired results and improved outcomes: integrating planning and assessment throughout learning activities. J Contin Educ Health Prof. 2009;29(1):1–15.

Tschida JE, Drahota A. Fidelity to the ACT SMART Toolkit: an instrumental case study of implementation strategy fidelity. Implement Sci Commun. 2023;4(1):52.

Colquhoun H, Leeman J, Michie S, Lokker C, Bragge P, Hempel S, et al. Towards a common terminology: a simplified framework of interventions to promote and integrate evidence into health practices, systems, and policies. Implement Sci. 2014;9(1):781.

Bertone MP, Meessen B, Clarysse G, Hercot D, Kelley A, Kafando Y, et al. Assessing communities of practice in health policy: a conceptual framework as a first step towards empirical research. Health Res Policy Sys. 2013;11(1):39.

Gagliardi AR, Legare F, Brouwers MC, Webster F, Wiljer D, Badley E, et al. Protocol: developing a conceptual framework of patient mediated knowledge translation, systematic review using a realist approach. Implement Sci. 2011;6:25.

Sargeant J, Borduas F, Sales A, Klein D, Lynn B, Stenerson H. CPD and KT: models used and opportunities for synergy. J Contin Educ Health Prof. 2011;31(3):167–73.

Stetler CB, Ritchie J, Rycroft-Malone J, Schultz A, Charns M. Improving quality of care through routine, successful implementation of evidence-based practice at the bedside: an organizational case study protocol using the Pettigrew and Whipp model of strategic change. Implement Sci. 2007;2:3.

Kok MO, Schuit AJ. Contribution mapping: a method for mapping the contribution of research to enhance its impact. Health Res Policy Syst. 2012;10:21.

Dadich A. From bench to bedside: methods that help clinicians use evidence-based practice. Aust Psychol. 2010;45(3):197–211.

Brown P, Bahri P. ‘Engagement’ of patients and healthcare professionals in regulatory pharmacovigilance: establishing a conceptual and methodological framework. Eur J Clin Pharmacol. 2019;75(9):1181–92.

Dobbins M, Ciliska D, Cockerill R, Barnsley J, DiCenso A. A framework for the dissemination and utilization of research for health-care policy and practice. Online J Knowl Synth Nurs. 2002;9(1):149–60.

Gagliardi AR, Brouwers MC, Bhattacharyya OK. The guideline implementability research and application network (GIRAnet): an international collaborative to support knowledge exchange: study protocol. Implement Sci. 2012;7:26.

Brooks SP, Zimmermann GL, Lang M, Scott SD, Thomson D, Wilkes G, et al. A framework to guide storytelling as a knowledge translation intervention for health-promoting behaviour change. Implement sci commun. 2022;3(1):35.

Cullen L, Hanrahan K, Edmonds SW, Reisinger HS, Wagner M. Iowa implementation for sustainability framework. Implement Sci. 2022;17(1):1.

Labbé D, Mahmood A, Miller WC, Mortenson WB. Examining the impact of knowledge mobilization strategies to inform urban stakeholders on accessibility: a mixed-methods study. Int J Environ Res Public Health. 2020;17(5):1561.

Straus SE, Tetroe J, Graham ID, Zwarenstein M, Bhattacharyya O, Shepperd S. Monitoring use of knowledge and evaluating outcomes. Can Med Assoc J. 2010;182(2):E94–8.

Bennett S, Whitehead M, Eames S, Fleming J, Low S, Caldwell E. Building capacity for knowledge translation in occupational therapy: learning through participatory action research. BMC Med Educ. 2016;16(1):257.

Brown C, Rogers S. Measuring the effectiveness of knowledge creation as a means of facilitating evidence-informed practice in early years settings in one London Borough. Lond Rev Educ. 2014;12(3):245–60.

Talbott E, De Los Reyes A, Kearns DM, Mancilla-Martinez J, Wang M. Evidence-based assessment in special education research: advancing the use of evidence in assessment tools and empirical processes. Except Child. 2023;89(4):467–87.

Rosella LC, Bornbaum C, Kornas K, Lebenbaum M, Peirson L, Fransoo R, et al. Evaluating the process and outcomes of a knowledge translation approach to supporting use of the Diabetes Population Risk Tool (DPoRT) in public health practice. Canadian J Program Eval. 2018;33(1):21–48.

Couineau AL, Forbes D. Using predictive models of behavior change to promote evidence-based treatment for PTSD. Psychol Trauma Theory Res Pract Policy. 2011;3(3):266–75.

Dufault M. Testing a collaborative research utilization model to translate best practices in pain management. Worldviews Evid Based Nurs. 2004;1:S26-32.

Beckett K, Farr M, Kothari A, Wye L, le May A. Embracing complexity and uncertainty to create impact: exploring the processes and transformative potential of co-produced research through development of a social impact model. Health Res Policy Syst. 2018;16(1):118.

Kramer DM, Wells RP, Carlan N, Aversa T, Bigelow PP, Dixon SM, et al. Did you have an impact? A theory-based method for planning and evaluating knowledge-transfer and exchange activities in occupational health and safety. Int J Occup Saf Ergon. 2013;19(1):41–62.

Duhamel F, Dupuis F, Turcotte A, Martinez AM, Goudreau J. Integrating the illness beliefs model in clinical practice: a family systems nursing knowledge utilization model. J FAM NURS. 2015;21(2):322–48.

Wimpenny P, Johnson N, Walter I, Wilkinson JE. Tracing and identifying the impact of evidence-use of a modified pipeline model. Worldviews Evid Based Nurs. 2008;5(1):3–12.

Ward V, Smith S, House A, Hamer S. Exploring knowledge exchange: a useful framework for practice and policy. Soc Sci Med. 2012;74(3):297–304.

Grooten L, Vrijhoef HJM, Alhambra-Borras T, Whitehouse D, Devroey D. The transfer of knowledge on integrated care among five European regions: a qualitative multi-method study. BMC Health Serv Res. 2020;20(1):11.

Stetler CB. Updating the Stetler Model of research utilization to facilitate evidence-based practice. Nurs Outlook. 2001;49(6):272–9.

Ward V. Why, whose, what and how? A framework for knowledge mobilisers. Evid Policy J Res Debate Pract. 2017;13(3):477–97.

Levin RF, Fineout-Overholt E, Melnyk BM, Barnes M, Vetter MJ. Fostering evidence-based practice to improve nurse and cost outcomes in a community health setting: a pilot test of the advancing research and clinical practice through close collaboration model. Nurs Adm Q. 2011;35(1):21–33.

Currie M, King G, Rosenbaum P, Law M, Kertoy M, Specht J. A model of impacts of research partnerships in health and social services. Eval Program Plann. 2005;28(4):400–12.

Richard L, Chiocchio F, Essiembre H, Tremblay MC, Lamy G, Champagne F, et al. Communities of practice as a professional and organizational development strategy in local public health organizations in Quebec, Canada: an evaluation model. Healthc Policy. 2014;9(3):26–39.

PubMed   PubMed Central   Google Scholar  

Rycroft-Malone J, Wilkinson J, Burton CR, Harvey G, McCormack B, Graham I, et al. Collaborative action around implementation in collaborations for leadership in applied health research and care: towards a programme theory. J Health Serv Res Policy. 2013;18(3 Suppl):13–26.

Gagliardi AR, Fraser N, Wright FC, Lemieux-Charles L, Davis D. Fostering knowledge exchange between researchers and decision-makers: exploring the effectiveness of a mixed-methods approach. Health Policy. 2008;86(1):53–63.

Paquette-Warren J, Harris SB, Naqshbandi Hayward M, Tompkins JW. Case study of evaluations that go beyond clinical outcomes to assess quality improvement diabetes programmes using the Diabetes Evaluation Framework for Innovative National Evaluations (DEFINE). J Eval Clin Pract. 2016;22(5):644–52.

Paquette-Warren J, Tyler M, Fournie M, Harris SB. The diabetes evaluation framework for innovative national evaluations (DEFINE): construct and content validation using a modified Delphi method. Can J diabetes. 2017;41(3):281–96.

Abbot ML, Lee KK, Rossiter MJ. Evaluating the effectiveness and functionality of professional learning communities in adult ESL Programs. TESL Canada J. 2018;35(2):1–25.

Ho K, Bloch R, Gondocz T, Laprise R, Perrier L, Ryan D, et al. Technology-enabled knowledge translation: frameworks to promote research and practice. J Contin Educ Heal Prof. 2004;24(2):90–9.

Yu X, Hu D, Li N, Xiao Y. Comprehensive evaluation on teachers’ knowledge sharing behavior based on the improved TOPSIS method. Comput Intell Neurosci. 2022;2022(101279357):2563210.

Arora S, Kalishman SG, Thornton KA, Komaromy MS, Katzman JG, Struminger BB, et al. Project ECHO: a telementoring network model for continuing professional development. J Contin Educ Health Prof. 2017;37(4):239–44.

Smidt A, Balandin S, Sigafoos J, Reed VA. The Kirkpatrick model: a useful tool for evaluating training outcomes. J Intellect Dev Disabil. 2009;34(3):266–74.

Jeffs L, Sidani S, Rose D, Espin S, Smith O, Martin K, et al. Using theory and evidence to drive measurement of patient, nurse and organizational outcomes of professional nursing practice. Int J Nurs Pract. 2013;19(2):141–8.

Skinner K. Developing a tool to measure knowledge exchange outcomes. Can J Program Eval. 2007;22(1):49–75.

Lavis J, Ross S, McLeod C, Gildiner A. Measuring the impact of health research. J Health Serv Res Policy. 2003;8(3):165–70.

Boyko JA, Lavis JN, Abelson J, Dobbins M, Carter N. Deliberative dialogues as a mechanism for knowledge translation and exchange in health systems decision-making. Soc Sci Med. 2012;75(11):1938–45.

Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38(2):65–76.

Gainforth HL, Latimer-Cheung AE, Athanasopoulos P, Martin Ginis KA. Examining the feasibility and effectiveness of a community-based organization implementing an event-based knowledge mobilization initiative to promote physical activity guidelines for people with spinal cord injury among support personnel. Health Promot Pract. 2015;16(1):55–62.

Glasgow RE, Harden SM, Gaglio B, Rabin B, Smith ML, Porter GC, et al. RE-AIM planning and evaluation framework: adapting to new science and practice with a 20-year review. Front Public Health. 2019. https://doi.org/10.3389/fpubh.2019.00064 .

Shelton RC, Chambers DA, Glasgow RE. An extension of RE-AIM to enhance sustainability: addressing dynamic context and promoting health equity over time. Front Public Health. 2020;8(101616579):134.

Bender BG, Simmons B, Konkoly N, Liu AH. The asthma toolkit bootcamp to improve rural primary care for pediatric asthma. J Allergy Clin Immunol Pract. 2021;9(8):3091-3097.e1.

de la Garza Iga FJ, Mejia Alvarez M, Cockroft JD, Rabin J, Cordon A, Elias Rodas DM, et al. Using the project ECHO TM model to teach mental health topics in rural Guatemala: an implementation science-guided evaluation. Int J Soc Psychiatry. 2023;69(8):2031–41.

Alkin M, Taut S. Unbundling evaluation use. Stud Educ Eval. 2003;29(1):1–12.

Varallyay NI, Langlois EV, Tran N, Elias V, Reveiz L. Health system decision-makers at the helm of implementation research: development of a framework to evaluate the processes and effectiveness of embedded approaches. Health Res Policy Syst. 2020;18(1):64.

McCabe KE, Wallace A, Crosland A. A model for collaborative working to facilitate knowledge mobilisation in public health. Evid Policy. 2015;11(4):559–76.

Gonzales R, Handley MA, Ackerman S, O’sullivan PS. A framework for training health professionals in implementation and dissemination science. Acad Med. 2012;87(3):271–8.

Edgar L, Herbert R, Lambert S, MacDonald JA, Dubois S, Latimer M. The joint venture model of knowledge utilization: a guide for change in nursing. Nurs Leadersh. 2006;9(2):41–55.

Stetler CB, Damschroder LJ, Helfrich CD, Hagedorn HJ. A Guide for applying a revised version of the PARIHS framework for implementation. Implement Sci. 2011;6(101258411):99.

Brennan SE, Cumpston M, Misso ML, McDonald S, Murphy MJ, Green SE. Design and formative evaluation of the policy liaison initiative: a long-term knowledge translation strategy to encourage and support the use of cochrane systematic reviews for informing. Evid Policy. 2016;12(1):25–52.

Hinchcliff R, Senserrick T, Travaglia J, Greenfield D, Ivers R. The enhanced knowledge translation and exchange framework for road safety: a brief report on its development and potential impacts. Inj Prev. 2017;23(2):114–7.

Ye J, Woods D, Bannon J, Bilaver L, Kricke G, McHugh M, et al. Identifying contextual factors and strategies for practice facilitation in primary care quality improvement using an informatics-driven model: framework development and mixed methods case study. JMIR Hum Factors. 2022;9(2): e32174.

Brangan J, Quinn S, Spirtos M. Impact of an evidence-based practice course on occupational therapist’s confidence levels and goals. Occup Ther Health Care. 2015;29(1):27–38.

Bonetti D, Johnston M, Pitts NB, Deery C, Ricketts I, Tilley C, et al. Knowledge may not be the best target for strategies to influence evidence-based practice: using psychological models to understand RCT effects. Int J Behav Med. 2009;16(3):287–93.

Buckley LL, Goering P, Parikh SV, Butterill D, Foo EKH. Applying a “stages of change” model to enhance a traditional evaluation of a research transfer course. J Eval Clin Pract. 2003;9(4):385–90.

Boyko JA, Lavis JN, Dobbins M, Souza NM. Reliability of a tool for measuring theory of planned behaviour constructs for use in evaluating research use in policymaking. Health Res Policy Syst. 2011;24(9):29.

Imani-Nasab MH, Yazdizadeh B, Salehi M, Seyedin H, Majdzadeh R. Validity and reliability of the Evidence Utilisation in Policymaking Measurement Tool (EUPMT). Health Res Policy Syst. 2017;15(1):66.

Dwan KM, McInnes P, Mazumdar S. Measuring the success of facilitated engagement between knowledge producers and users: a validated scale. Evid Policy. 2015;11(2):239–52.

Haynes A, Rowbotham S, Grunseit A, Bohn-Goldbaum E, Slaytor E, Wilson A, et al. Knowledge mobilisation in practice: an evaluation of the Australian Prevention Partnership Centre. Health Res Policy Sys. 2020;18(1):13.

Haines M, Brown B, Craig J, D’Este C, Elliott E, Klineberg E, et al. Determinants of successful clinical networks: the conceptual framework and study protocol. Implement Sci. 2012;7(101258411):16.

Ko LK, Jang SH, Friedman DB, Glanz K, Leeman J, Hannon PA, et al. An application of the science impact framework to the cancer prevention and control research network from 2014–2018. Prev Med. 2019;12: 105821.

Leeman J, Sommers J, Vu M, Jernigan J, Payne G, Thompson D, et al. An evaluation framework for obesity prevention policy interventions. Prev Chronic Dis. 2012;9(101205018):E120.

Pettman TL, Armstrong R, Waters E, Allender S, Love P, Gill T, et al. Evaluation of a knowledge translation and exchange platform to advance non-communicable disease prevention. Evid Policy. 2016;12(1):109–26.

Yearwood AC. Applying a logical theory of change for strengthening research uptake in policy: a case study of the Evidence Informed Decision Making Network of the Caribbean. Rev Panam Salud Publica. 2018;42: e91.

Thomson D, Brooks S, Nuspl M, Hartling L. Programme theory development and formative evaluation of a provincial knowledge translation unit. Health Res Policy Syst. 2019;17(1):40.

Garad R, Kozica-Olenski S, Teede HJ. Evaluation of a center of research excellence in polycystic ovary syndrome as a large-scale collaborative research translation initiative, including evaluating translation of guideline impact. Semin Reprod Med. 2018;36(1):42–9.

Reddy S, Wakerman J, Westhorp G, Herring S. Evaluating impact of clinical guidelines using a realist evaluation framework. J Eval Clin Pract. 2015;21(6):1114–20.

Van Eerd D, Moser C, Saunders R. A research impact model for work and health. Am J Ind Med. 2021;64(1):3–12.

Yip O, Huber E, Stenz S, Zullig LL, Zeller A, De Geest SM, et al. A contextual analysis and logic model for integrated care for frail older adults living at home: The INSPIRE Project. Int J Integr Care. 2021;21(2):9.

Guo R, Bain BA, Willer J. Application of a logic model to an evidence-based practice training program for speech-language pathologists and audiologists. J Allied Health. 2011;40(1):e23–8.

PubMed   Google Scholar  

McDonald S, Turner T, Chamberlain C, Lumbiganon P, Thinkhamrop J, Festin MR, et al. Building capacity for evidence generation, synthesis and implementation to improve the care of mothers and babies in South East Asia: methods and design of the SEA-ORCHID Project using a logical framework approach. BMC Med Res Methodol. 2010;10(100968545):61.

Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43(3):337–50.

Wensing M, Grol R. Knowledge translation in health: how implementation science could contribute more. BMC Med. 2019;17(1):88.

Kreindler SA. Advancing the evaluation of integrated knowledge translation. Health Res Policy Sys. 2018;16(1):104.

Best A, Holmes B. Systems thinking, knowledge and action: towards better models and methods. Evid Policy. 2010;6(2):145–59.

van Mil HGJ, Foegeding EA, Windhab EJ, Perrot N, van der Linden E. A complex system approach to address world challenges in food and agriculture. Trends Food Sci Technol. 2014;40(1):20–32.

Wehrens R. Beyond two communities – from research utilization and knowledge translation to co-production? Public Health. 2014;128(6):545–51.

Ridde V, Pérez D, Robert E. Using implementation science theories and frameworks in global health. BMJ Glob Health. 2020;5(4): e002269.

Salter KL, Kothari A. Using realist evaluation to open the black box of knowledge translation: a state-of-the-art review. Implement Sci. 2014;9(1):115.

Van Eerd D, Cole D, Keown K, Irvin E, Kramer D, Gibson B, et al. Report on knowledge transfer and exchange practices: A systematic review of the quality and types of instruments used to assess KTE implementation and impact. Toronto: Institute for Work & Health; 2011 p. 130. https://www.iwh.on.ca/sites/iwh/files/iwh/reports/iwh_sys_review_kte_evaluation_tools_2011_rev.pdf

Dobbins M, Robeson P, Ciliska D, Hanna S, Cameron R, O’Mara L, et al. A description of a knowledge broker role implemented as part of a randomized controlled trial evaluating three knowledge translation strategies. Implement Sci. 2009;4(1):23.

Rychetnik L, Bauman A, Laws R, King L, Rissel C, Nutbeam D, et al. Translating research for evidence-based public health: key concepts and future directions. J Epidemiol Community Health. 2012;66(12):1187–92.

Weiner BJ, Lewis CC, Stanick C, Powell BJ, Dorsey CN, Clary AS, et al. Psychometric assessment of three newly developed implementation outcome measures. Implement Sci. 2017;12(1):108.

Bhawra J, Skinner K. Examination of tools associated with the evaluation of knowledge uptake and utilization: a scoping review. Eval Program Plann. 2020;83: 101875.

Lane JP, Stone VI, Nobrega A, Tomita M. Level Of Knowledge Use Survey (LOKUS): a validated instrument for tracking knowledge uptake and use. Stud Health Technol Inform. 2015;217:106–10.

Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8(1):139.

Albrecht L, Archibald M, Arseneau D, Scott SD. Development of a checklist to assess the quality of reporting of knowledge translation interventions using the Workgroup for Intervention Development and Evaluation Research (WIDER) recommendations. Implement Sci. 2013;8(1):52.

Bragge P, Grimshaw JM, Lokker C, Colquhoun H, Albrecht L, Baron J, et al. AIMD - a validated, simplified framework of interventions to promote and integrate evidence into health practices, systems, and policies. BMC Med Res Methodol. 2017;17(1):38.

Kastner M, Makarski J, Hayden L, Lai Y, Chan J, Treister V, et al. Improving KT tools and products: development and evaluation of a framework for creating optimized, Knowledge-activated Tools (KaT). Implement Sci Commun. 2020;1(1):47.

Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, et al. Standards for Reporting Implementation Studies (StaRI) Statement. BMJ. 2017;6(356): i6795.

Lokker C, McKibbon KA, Colquhoun H, Hempel S. A scoping review of classification schemes of interventions to promote and integrate evidence into practice in healthcare. Implement Sci. 2015;10(1):27.

Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;7(348): g1687.

Wilson PM, Sales A, Wensing M, Aarons GA, Flottorp S, Glidewell L, et al. Enhancing the reporting of implementation research. Implement Sci. 2017;12(1):13.

Download references


Acknowledgements

We wish to thank Julie Desnoyers for designing and implementing the search strategy, Gabrielle Legendre for her contribution in the screening phase, and Karine Souffez and Caroline Tessier for their input during the project.

Funding

This project was supported by an Insight Grant from the Social Sciences and Humanities Research Council of Canada (SSHRC) and by the Équipe RENARD (FRQ-SC). The funding bodies had no role in the conduct of this scoping review.

Author information

Authors and Affiliations

School of Business Administration, Université TÉLUQ, Montreal, Canada

Saliha Ziam & Laura Justine Chouinard

Department of School and Social Adaptation Studies, Faculty of Education, Université de Sherbrooke, Sherbrooke, Canada

Sèverine Lanoue, Esther McSween-Cadieux, Julie Lane & Ollivier Prigent

Department of Psychology, Université du Québec à Montréal, Montreal, Canada

Mathieu-Joël Gervais

Centre RBC d’expertise Universitaire en Santé Mentale, Université de Sherbrooke, Sherbrooke, Canada

School of Rehabilitation, Faculty of Medicine, Université de Montréal, Montreal, Canada

Dina Gaid & Quan Nha Hong

Department of Psychology, Université de Montréal, Montreal, Canada

Christian Dagenais

Université Paris Cité, IRD (Institute for Research on Sustainable Development), CEPED, Paris, France

Valéry Ridde

Institute of Health and Development (ISED), Cheikh Anta Diop University, Dakar, Senegal

Public Health Intelligence and Knowledge Translation Division, Public Health Agency of Canada, Ottawa, Canada

Emmanuelle Jean

Coordinator of the Interregional Consortium of Knowledge in Health and Social Services (InterS4), Rimouski, Canada

France Charles Fleury



Contributions

SZ, MJG, EMC, JL, CD, EJ, KS, VR and CT were involved in developing and designing the scoping review. EMC, MJG and GL (collaborator) screened articles in duplicate. SL, DG, LJC and OP extracted data from the included articles. SL and DG synthesized the data. SL, SZ and EMC drafted the manuscript. SZ led the project, supervised and assisted the research team at every stage, and secured the funding. All authors provided substantive feedback and approved the manuscript prior to submission.

Corresponding author

Correspondence to Saliha Ziam.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Keywords and search strategy.

Additional file 2.

Summary of included articles.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article

Ziam, S., Lanoue, S., McSween-Cadieux, E. et al. A scoping review of theories, models and frameworks used or proposed to evaluate knowledge mobilization strategies. Health Res Policy Syst 22, 8 (2024). https://doi.org/10.1186/s12961-023-01090-7


Received: 16 June 2023

Accepted: 07 December 2023

Published: 10 January 2024

DOI: https://doi.org/10.1186/s12961-023-01090-7


Keywords

  • Knowledge translation
  • Scoping review

Health Research Policy and Systems

ISSN: 1478-4505




RANSAC-based instantaneous real-time kinematic positioning with GNSS triple-frequency signals in urban areas

  • Original Article
  • Published: 06 April 2024
  • Volume 98, article number 24 (2024)

Cite this article

  • Qi Cheng 1,
  • Wu Chen 1,
  • Rui Sun (ORCID: orcid.org/0000-0003-2252-9944) 2,
  • Junhui Wang 1 &
  • Duojie Weng 1

The demand for high-precision positioning has risen substantially in modern urban settings. In that regard, Global Navigation Satellite Systems (GNSS) offer several advantages, such as global coverage, real-time capability, high accuracy, ease of use, and cost-effectiveness. The accuracy of GNSS-based positioning, however, suffers in urban environments due to signal blockage, reflection, and diffraction, which makes it difficult to fix ambiguities correctly within a real-time kinematic (RTK) framework. To address this issue, this paper applies random sample consensus (RANSAC) to develop a novel single-epoch triple-frequency RTK positioning method. In the proposed method, the ambiguities of the extra-wide-lane, wide-lane, and original frequencies are resolved sequentially, and RANSAC then detects and excludes incorrectly fixed ambiguities. To validate the effectiveness of the proposed method, two static experiments (cases 1 and 2) and one dynamic experiment (case 3) were conducted in representative urban areas. The findings demonstrate that the proposed method outperforms all comparative methods in positional availability, with comparable positional accuracy in terms of root-mean-square errors (RMSEs). In cases 1, 2, and 3, the proposed method achieves 3D RMSEs of 2.74, 4.29, and 20.35 cm and positional availabilities of 100%, 75.0%, and 73.1%, using a 10-degree mask angle and a carrier-to-noise ratio (C/N0) threshold of 35 dB-Hz. The corresponding RMSEs (positional availabilities) of the comparative methods range from 1.51 to 4.04 cm (75.7 to 96.3%) in case 1, 4.19 to 7.78 cm (34.5 to 49.9%) in case 2, and 23.52 to 37.54 cm (15.4 to 33.9%) in case 3. Compared to these methods, the proposed method improves positional availability by 3.7 to 24.3 percentage points in case 1, 25.1 to 40.5 percentage points in case 2, and 39.2 to 57.7 percentage points in case 3.
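The RANSAC procedure named in the abstract follows the classic hypothesize-and-verify pattern of Fischler and Bolles (1981). The snippet below is a minimal, generic sketch in Python, not the authors' implementation: the data, the scalar "model", and the threshold are illustrative assumptions chosen only to show how a consensus loop can isolate consistent measurements from gross outliers, the same principle the paper applies to flag wrongly fixed ambiguities.

```python
import random

def ransac_inliers(measurements, model_fn, residual_fn, threshold,
                   iterations=100, sample_size=2, seed=0):
    """Generic RANSAC loop: repeatedly fit a model to a random minimal
    sample and keep the hypothesis with the largest consensus set."""
    rng = random.Random(seed)                 # fixed seed for repeatability
    best_inliers = []
    for _ in range(iterations):
        sample = rng.sample(measurements, sample_size)
        model = model_fn(sample)              # hypothesis from a minimal sample
        inliers = [m for m in measurements
                   if residual_fn(model, m) < threshold]
        if len(inliers) > len(best_inliers):  # keep the largest consensus set
            best_inliers = inliers
    return best_inliers

# Toy data: five mutually consistent values plus two gross outliers,
# standing in for solutions contaminated by wrongly fixed ambiguities.
data = [10.01, 9.99, 10.02, 9.98, 25.0, 10.00, -3.0]
inliers = ransac_inliers(
    data,
    model_fn=lambda s: sum(s) / len(s),   # model = mean of the sample
    residual_fn=lambda m, x: abs(x - m),  # residual = distance to the model
    threshold=0.1,
)
```

In the paper's setting the hypothesis would be a position solution computed from a subset of fixed ambiguities and the residuals would be observation residuals; a scalar mean stands in here only to keep the sketch self-contained.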



Data availability

The data that support the results of this study are available from the corresponding author for academic purposes on reasonable request.


Wen W, Zhang G, Hsu LT (2019a) GNSS NLOS exclusion based on dynamic object detection using LiDAR point cloud. IEEE Trans Intell Transp Syst 22(2):853–862

Wen W, Zhang G, Hsu LT (2019b) Correcting NLOS by 3D LiDAR and building height to improve GNSS single point positioning. Navigation 66(4):705–718

Werner W, Winkel J (2003) TCAR and MCAR options with Galileo and GPS. In: Proceedings of the ION GPS/GNSS 2003, 9–12 September, Portland, OR, pp 790–800

Xu P (2001) Random simulation and GPS decorrelation. J Geod 75:408–423

Xu Y, Chen W (2018) Performance analysis of GPS/BDS dual/triple-frequency network RTK in urban areas: a case study in Hong Kong. Sensors 18(8):2437

Xu P, Shi C, Liu J (2012) Integer estimation methods for GPS ambiguity resolution: an applications oriented review and improvement. Surv Rev 44(324):59–71

Xu H, Angrisano A, Gaglione S, Hsu LT (2020) Machine learning based LOS/NLOS classifier and robust estimator for GNSS shadow matching. Satell Navig 1(1):1–12

Xu P, Cannon E, Lachapelle G (1995) Mixed integer programming for the resolution of GPS carrier phase ambiguities. Presented at IUGG95 Assembly, 2–14 July, Boulder, CO, USA

Yozevitch R, Moshe BB, Weissman A (2016) A robust GNSS LOS/NLOS signal classifier. Navigation 63(4):427–440

Zhang Z, Li B, He X, Zhang Z, Miao W (2020) Models, methods and assessment of four-frequency carrier ambiguity resolution for BeiDou-3 observations. GPS Solut 24:1–12

Zhou Y (2011) A new practical approach to GNSS high-dimensional ambiguity decorrelation. GPS Solut 15:325–331

Zhu Z, Hill N, Krebs A, Vinande E, Pontious J (2022) RTK and PPK solutions with a short baseline assisted with random sample consensus. In: Proceedings of the 35th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2022), pp 2800–2809

Zhu Z, Hindi A, Dickerson C, Vinande E, Pontious J (2023) Validation of low-cost RTK and PPK solutions assisted with random sample consensus in a GNSS-challenged environment. In: Proceedings of the 36th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2023), pp 2737–2748

Funding


This work was supported in part by the sponsorship of the University Grants Committee of Hong Kong under the scheme Research Impact Fund (Grant No. R5009-21), the Research Institute of Land and System, Hong Kong Polytechnic University, the National Natural Science Foundation of China (Grant No. 41974033, 42174025), and the Natural Science Foundation of Jiangsu Province (Grant No. BK20211569).

Author information

Authors and Affiliations

Department of Land Survey and Geo-Informatics, The Hong Kong Polytechnic University, Hong Kong, China

Qi Cheng, Wu Chen, Junhui Wang & Duojie Weng

College of Civil Aviation, Nanjing University of Aeronautics and Astronautics, Nanjing, 211106, China



QC and WC designed this method; QC conducted the experiments, analysed the data, and drafted this manuscript; WC, RS, JHW, and DJW revised this manuscript.

Corresponding author

Correspondence to Rui Sun.

Ethics declarations

Conflict of interest.

The authors declare that they have no conflict of interest.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article

Cheng, Q., Chen, W., Sun, R. et al. RANSAC-based instantaneous real-time kinematic positioning with GNSS triple-frequency signals in urban areas. J Geod 98, 24 (2024). https://doi.org/10.1007/s00190-024-01833-6


Received: 18 September 2023

Accepted: 07 March 2024

Published: 06 April 2024

DOI: https://doi.org/10.1007/s00190-024-01833-6


  • Triple-frequency
  • Urban environments


How to Write an APA Methods Section | With Examples

Published on February 5, 2021 by Pritha Bhandari. Revised on June 22, 2023.

The methods section of an APA style paper is where you report in detail how you performed your study. Research papers in the social and natural sciences often follow APA style. This article focuses on reporting quantitative research methods.

In your APA methods section, you should report enough information to understand and replicate your study, including detailed information on the sample , measures, and procedures used.



Table of contents

  • Structuring an APA methods section
  • Example of an APA methods section
  • Other interesting articles
  • Frequently asked questions about writing an APA methods section

The main heading of “Methods” should be centered, boldfaced, and capitalized. Subheadings within this section are left-aligned, boldfaced, and in title case. You can also add lower level headings within these subsections, as long as they follow APA heading styles .

To structure your methods section, you can use the subheadings of “Participants,” “Materials,” and “Procedures.” These headings are not mandatory—aim to organize your methods section using subheadings that make sense for your specific study.

Note that not all of these topics will necessarily be relevant for your study. For example, if you didn’t need to consider outlier removal or ways of assigning participants to different conditions, you don’t have to report these steps.

The APA also provides specific reporting guidelines for different types of research design. These tell you exactly what you need to report for longitudinal designs , replication studies, experimental designs , and so on. If your study uses a combination design, consult APA guidelines for mixed methods studies.

Detailed descriptions of procedures that don’t fit into your main text can be placed in supplemental materials (for example, the exact instructions and tasks given to participants, the full analytical strategy including software code, or additional figures and tables).


Begin the methods section by reporting sample characteristics, sampling procedures, and the sample size.

Participant or subject characteristics

When discussing people who participate in research, descriptive terms like “participants,” “subjects” and “respondents” can be used. For non-human animal research, “subjects” is more appropriate.

Specify all relevant demographic characteristics of your participants. This may include their age, sex, ethnic or racial group, gender identity, education level, and socioeconomic status. Depending on your study topic, other characteristics like educational or immigration status or language preference may also be relevant.

Be sure to report these characteristics as precisely as possible. This helps the reader understand how far your results may be generalized to other people.

The APA guidelines emphasize writing about participants using bias-free language , so it’s necessary to use inclusive and appropriate terms.

Sampling procedures

Outline how the participants were selected and all inclusion and exclusion criteria applied. Appropriately identify the sampling procedure used. For example, you should only label a sample as random if you had access to every member of the relevant population.

Of all the people invited to participate in your study, note the percentage that actually did (if you have this data). Additionally, report whether participants were self-selected, either by themselves or by their institutions (e.g., schools may submit student data for research purposes).

Identify any compensation (e.g., course credits or money) that was provided to participants, and mention any institutional review board approvals and ethical standards followed.

Sample size and power

Detail the sample size (per condition) and statistical power that you hoped to achieve, as well as any analyses you performed to determine these numbers.

It’s important to show that your study had enough statistical power to find effects if there were any to be found.

Additionally, state whether your final sample differed from the intended sample. Your interpretations of the study outcomes should be based only on your final sample rather than your intended sample.

Write up the tools and techniques that you used to measure relevant variables. Be as thorough as possible for a complete picture of your techniques.

Primary and secondary measures

Define the primary and secondary outcome measures that will help you answer your primary and secondary research questions.

Specify all instruments used in gathering these measurements and the construct that they measure. These instruments may include hardware, software, or tests, scales, and inventories.

  • To cite hardware, indicate the model number and manufacturer.
  • To cite common software (e.g., Qualtrics), state the full name along with the version number or the website URL.
  • To cite tests, scales, or inventories, reference their manual or the article they were published in. It's also helpful to state the number of items and provide one or two example items.

Make sure to report the settings (e.g., screen resolution) of any specialized apparatus used.

For each instrument used, report measures of the following:

  • Reliability: how consistently the method measures something, in terms of internal consistency or test-retest reliability.
  • Validity: how precisely the method measures something, in terms of construct validity or criterion validity.
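Internal consistency, for instance, is commonly summarized with Cronbach's alpha. A minimal Python sketch (illustrative only, using population variance; real studies would typically rely on a statistics package):

```python
# Hedged sketch: Cronbach's alpha, a common internal-consistency reliability
# coefficient, computed from raw item scores.

def cronbach_alpha(scores):
    """scores: list of respondents, each a list of item scores."""
    k = len(scores[0])  # number of items

    def variance(values):  # population variance
        m = sum(values) / len(values)
        return sum((v - m) ** 2 for v in values) / len(values)

    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Example: four respondents answering three Likert items
responses = [
    [4, 5, 4],
    [2, 3, 2],
    [5, 5, 5],
    [1, 2, 1],
]
alpha = cronbach_alpha(responses)
```

When items are perfectly consistent (every respondent gives the same score on each item), alpha reaches 1.0; values around .7 or higher are conventionally treated as acceptable.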

Giving an example item or two for tests, questionnaires , and interviews is also helpful.

Describe any covariates—these are any additional variables that may explain or predict the outcomes.

Quality of measurements

Review all methods you used to assure the quality of your measurements.

These may include:

  • training researchers to collect data reliably,
  • using multiple people to assess (e.g., observe or code) the data,
  • translation and back-translation of research materials,
  • using pilot studies to test your materials on unrelated samples.

For data that’s subjectively coded (for example, classifying open-ended responses), report interrater reliability scores. This tells the reader how similarly each response was rated by multiple raters.

Report all of the procedures applied for administering the study, processing the data, and for planned data analyses.

Data collection methods and research design

Data collection methods refers to the general mode of the instruments: surveys, interviews, observations, focus groups, neuroimaging, cognitive tests, and so on. Summarize exactly how you collected the necessary data.

Describe all procedures you applied in administering surveys, tests, physical recordings, or imaging devices, with enough detail so that someone else can replicate your techniques. If your procedures are very complicated and require long descriptions (e.g., in neuroimaging studies), place these details in supplementary materials.

To report research design, note your overall framework for data collection and analysis. State whether you used an experimental, quasi-experimental, descriptive (observational), correlational, and/or longitudinal design. Also note whether a between-subjects or a within-subjects design was used.

For multi-group studies, report the following design and procedural details as well:

  • how participants were assigned to different conditions (e.g., randomization),
  • instructions given to the participants in each group,
  • interventions for each group,
  • the setting and length of each session(s).

Describe whether any masking was used to hide the condition assignment (e.g., placebo or medication condition) from participants or research administrators. Using masking in a multi-group study ensures internal validity by reducing research bias . Explain how this masking was applied and whether its effectiveness was assessed.

Participants were randomly assigned to a control or experimental condition. The survey was administered using Qualtrics (https://www.qualtrics.com). To begin, all participants were given the AAI and a demographics questionnaire to complete, followed by an unrelated filler task. In the control condition , participants completed a short general knowledge test immediately after the filler task. In the experimental condition, participants were asked to visualize themselves taking the test for 3 minutes before they actually did. For more details on the exact instructions and tasks given, see supplementary materials.
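Simple random assignment like this is easy to script. A hedged Python sketch; the seeded shuffle is an illustrative choice for reproducibility and auditing, not part of any APA requirement:

```python
# Hedged sketch: randomly assigning participants to a control and an
# experimental condition via a seeded shuffle (reproducible for auditing).
import random

def assign_conditions(participant_ids, seed=42):
    rng = random.Random(seed)
    ids = list(participant_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {"control": ids[:half], "experimental": ids[half:]}

groups = assign_conditions(range(10))
```

Every participant lands in exactly one group, and rerunning with the same seed reproduces the same assignment.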

Data diagnostics

Outline all steps taken to scrutinize or process the data after collection.

This includes the following:

  • Procedures for identifying and removing outliers
  • Data transformations to normalize distributions
  • Compensation strategies for overcoming missing values

To ensure high validity, you should provide enough detail for your reader to understand how and why you processed or transformed your raw data in these specific ways.
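Two of the diagnostic steps above, compensating for missing values and removing outliers, can be sketched in Python. The mean imputation and the standard-deviation cutoff used here are illustrative assumptions, not recommendations:

```python
# Hedged sketch of two common data-diagnostic steps: mean imputation for
# missing values (represented as None) and removal of outliers beyond a
# z-score cutoff. The specific choices are illustrative only.
import math

def impute_missing(values):
    observed = [v for v in values if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in values]

def remove_outliers(values, z_cutoff=3.0):
    mean = sum(values) / len(values)
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    return [v for v in values if abs(v - mean) <= z_cutoff * sd]

filled = impute_missing([4, 5, None, 6])                      # [4, 5, 5.0, 6]
trimmed = remove_outliers([5, 6, 5, 4, 6, 5, 100], z_cutoff=2.0)  # drops 100
```

Whatever steps are chosen, the methods section should state them and the thresholds used so readers can judge (and replicate) the processing.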

Analytic strategies

The methods section is also where you describe your statistical analysis procedures, but not their outcomes. Their outcomes are reported in the results section.

These procedures should be stated for all primary, secondary, and exploratory hypotheses. While primary and secondary hypotheses are based on a theoretical framework or past studies, exploratory hypotheses are guided by the data you’ve just collected.



The AI-powered APA Citation Checker points out every error, tells you exactly what’s wrong, and explains how to fix it. Say goodbye to losing marks on your assignment!

Get started!


This annotated example reports methods for a descriptive correlational survey on the relationship between religiosity and trust in science in the US.

The sample included 879 adults aged between 18 and 28. More than half of the participants were women (56%), and all participants had completed at least 12 years of education. Ethics approval was obtained from the university board before recruitment began. Participants were recruited online through Amazon Mechanical Turk (MTurk; www.mturk.com). We selected for a geographically diverse sample within the Midwest of the US through an initial screening survey. Participants were paid USD $5 upon completion of the study.

A sample size of at least 783 was deemed necessary for detecting a correlation coefficient of ±.1, with a power level of 80% and a significance level of .05, using a sample size calculator (www.sample-size.net/correlation-sample-size/).
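The figure above can be reproduced with the standard Fisher z approximation for correlation sample sizes. A minimal Python sketch; the hardcoded quantiles correspond to two-sided α = .05 and 80% power:

```python
# Hedged sketch: approximate sample size needed to detect a Pearson
# correlation, via the Fisher z transformation. Reproduces the figure
# quoted above (r = .1, 80% power, two-sided alpha = .05 -> n = 783).
import math

Z_ALPHA_TWO_SIDED = 1.959964  # standard normal quantile for alpha/2 = .025
Z_POWER_80 = 0.841621         # standard normal quantile for power = .80

def sample_size_for_correlation(r, z_alpha=Z_ALPHA_TWO_SIDED, z_beta=Z_POWER_80):
    fisher_z = math.atanh(r)  # 0.5 * ln((1 + r) / (1 - r))
    return math.ceil(((z_alpha + z_beta) / fisher_z) ** 2 + 3)

n = sample_size_for_correlation(0.1)  # 783
```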

The primary outcome measures were the levels of religiosity and trust in science. Religiosity refers to involvement and belief in religious traditions, while trust in science represents confidence in scientists and scientific research outcomes. The secondary outcome measures were gender and parental education levels of participants and whether these characteristics predicted religiosity levels.


Religiosity

Religiosity was measured using the Centrality of Religiosity scale (Huber, 2003). The Likert scale is made up of 15 questions with five subscales of ideology, experience, intellect, public practice, and private practice. An example item is “How often do you experience situations in which you have the feeling that God or something divine intervenes in your life?” Participants were asked to indicate frequency of occurrence by selecting a response ranging from 1 (very often) to 5 (never). The internal consistency of the instrument is .83 (Huber & Huber, 2012).

Trust in Science

Trust in science was assessed using the General Trust in Science index (McCright, Dentzman, Charters & Dietz, 2013). Four Likert scale items were assessed on a scale from 1 (completely distrust) to 5 (completely trust). An example question asks “How much do you distrust or trust scientists to create knowledge that is unbiased and accurate?” Internal consistency was .8.

Potential participants were invited to participate in the survey online using Qualtrics (www.qualtrics.com). The survey consisted of multiple choice questions regarding demographic characteristics, the Centrality of Religiosity scale, an unrelated filler anagram task, and finally the General Trust in Science index. The filler task was included to avoid priming or demand characteristics, and an attention check was embedded within the religiosity scale. For full instructions and details of tasks, see supplementary materials.

For this correlational study , we assessed our primary hypothesis of a relationship between religiosity and trust in science using Pearson moment correlation coefficient. The statistical significance of the correlation coefficient was assessed using a t test. To test our secondary hypothesis of parental education levels and gender as predictors of religiosity, multiple linear regression analysis was used.
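The primary analysis described above can be sketched directly: Pearson's r between the two scales, and the t statistic used to test its significance. A minimal Python illustration (a real analysis would typically use a statistics package):

```python
# Hedged sketch: Pearson correlation coefficient and the t statistic for
# testing its significance, t = r * sqrt((n - 2) / (1 - r^2)).
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def t_statistic(r, n):
    return r * math.sqrt((n - 2) / (1 - r ** 2))

r = pearson_r([1, 2, 3, 4], [1, 3, 2, 4])  # 0.8
t = t_statistic(r, 4)
```

The t value is then compared against a t distribution with n − 2 degrees of freedom; that comparison belongs in the results section, not the methods.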

If you want to know more about statistics , methodology , or research bias , make sure to check out some of our other articles with explanations and examples.

  • Normal distribution
  • Measures of central tendency
  • Chi square tests
  • Confidence interval
  • Quartiles & Quantiles


  • Cluster sampling
  • Stratified sampling
  • Thematic analysis
  • Cohort study
  • Peer review
  • Ethnography

Research bias

  • Implicit bias
  • Cognitive bias
  • Conformity bias
  • Hawthorne effect
  • Availability heuristic
  • Attrition bias
  • Social desirability bias

In your APA methods section, you should report detailed information on the participants, materials, and procedures used.

  • Describe all relevant participant or subject characteristics, the sampling procedures used, and the sample size and power.
  • Define all primary and secondary measures and discuss the quality of measurements.
  • Specify the data collection methods, the research design and data analysis strategy, including any steps taken to transform the data and statistical analyses.

You should report methods using the past tense, even if you haven’t completed your study at the time of writing. That’s because the methods section is intended to describe completed actions or research.

In a scientific paper, the methodology always comes after the introduction and before the results, discussion, and conclusion. The same basic structure also applies to a thesis, dissertation, or research proposal.

Depending on the length and type of document, you might also include a literature review or theoretical framework before the methodology.

Cite this Scribbr article

If you want to cite this source, you can copy and paste the citation or click the “Cite this Scribbr article” button to automatically add the citation to our free Citation Generator.

Bhandari, P. (2023, June 22). How to Write an APA Methods Section | With Examples. Scribbr. Retrieved April 2, 2024, from https://www.scribbr.com/apa-style/methods-section/



Consultation: Proposed Minor Revision to IALM Methodology 

Crop seedling growing in a field of dirt.

Photo by Roman Synkevych via Unsplash.

Verra is launching a public consultation for a minor revision (PDF) (methodology development ID #M0268) to VM0042 Methodology for Improved Agricultural Land Management, v2.0 in the Verified Carbon Standard (VCS) Program . With this proposed revision, project proponents will be able to differentiate between Verified Carbon Units (VCUs) based on carbon dioxide removals and VCUs based on greenhouse gas emission reductions.

VM0042 includes the quantification of both emission reductions and removals. However, in its current form, it does not allow proponents to differentiate between these outcomes and quantify the exact number of VCUs resulting from reduction or removal activities, respectively. Verra is therefore proposing updates to the equation in Section 8 of VM0042.

The revision would also enable project proponents using VM0042 to apply the appropriate mitigation outcome label to VCUs generated by their projects. The mitigation outcome label, which was released in August 2023 as part of updates to the VCS Program, is a new market label for VCUs. Please see the Mitigation Outcome Type Labels Guidance, v1.2 (PDF) for more information.

The proposed revision includes the corrections and clarifications issued in January 2024 .

To provide feedback on the proposed revisions, please complete the M0268 VM0042 Comment Template (xlsx) by May 6, 2024.


Computer Science > Machine Learning

Title: Red Teaming GPT-4V: Are GPT-4V Safe Against Uni/Multi-Modal Jailbreak Attacks?

Abstract: Various jailbreak attacks have been proposed to red-team Large Language Models (LLMs) and revealed the vulnerable safeguards of LLMs. Besides, some methods are not limited to the textual modality and extend the jailbreak attack to Multimodal Large Language Models (MLLMs) by perturbing the visual input. However, the absence of a universal evaluation benchmark complicates the performance reproduction and fair comparison. Besides, there is a lack of comprehensive evaluation of closed-source state-of-the-art (SOTA) models, especially MLLMs, such as GPT-4V. To address these issues, this work first builds a comprehensive jailbreak evaluation dataset with 1445 harmful questions covering 11 different safety policies. Based on this dataset, extensive red-teaming experiments are conducted on 11 different LLMs and MLLMs, including both SOTA proprietary models and open-source models. We then conduct a deep analysis of the evaluated results and find that (1) GPT-4 and GPT-4V demonstrate better robustness against jailbreak attacks compared to open-source LLMs and MLLMs. (2) Llama2 and Qwen-VL-Chat are more robust compared to other open-source models. (3) The transferability of visual jailbreak methods is relatively limited compared to textual jailbreak methods. The dataset and code can be found here https://anonymous.4open.science/r/red_teaming_gpt4-C1CE/README.md.




  23. Water

    The present work presents a methodology based on the use of stochastic weather generators (WGs) for the estimation of high-return-period floods under climate change scenarios. Applying the proposed methodology in a case study, Rambla de la Viuda (Spain), satisfactory results were obtained through the regionalization of the bias-corrected EUROCORDEX climate projections and the integration of ...

  24. A scoping review of theories, models and frameworks used or proposed to

    To survey the available knowledge on evaluation practices for KMb strategies, we conducted a scoping review. According to Munn et al. [], a scoping review is indicated to identify the types of available evidence and knowledge gaps, to clarify concepts in the literature and to identify key characteristics or factors related to a concept.This review methodology also allows for the inclusion of a ...

  25. 60-Day Notice of Proposed Information Collection: Border Crossing

    Evaluate whether the proposed information collection is necessary for the proper functions of the Department. Evaluate the accuracy of our estimate of the time and cost burden for this proposed collection, including the validity of the methodology and assumptions used. Enhance the quality, utility, and clarity of the information to be collected.

  26. RANSAC-based instantaneous real-time kinematic positioning ...

    Then, multi-feature machine learning-based methods have been proposed to address the unreliability of using only C/N 0 or elevation angle detection and isolation (Yozevitch et al. 2016; Hsu 2017; Sun et al. 2020; Xu et al. 2020; Suzuki and Amano 2021). That work has shown that the key to improving the performance of machine learning-based ...

  27. How to Write an APA Methods Section

    The main heading of "Methods" should be centered, boldfaced, and capitalized. Subheadings within this section are left-aligned, boldfaced, and in title case. You can also add lower level headings within these subsections, as long as they follow APA heading styles. To structure your methods section, you can use the subheadings of ...

  28. Consultation: Proposed Minor Revision to IALM Methodology

    Verra is launching a public consultation for a minor revision (PDF) (methodology development ID #M0268) to VM0042 Methodology for Improved Agricultural Land Management, v2.0 in the Verified Carbon Standard (VCS) Program.With this proposed revision, project proponents will be able to differentiate between Verified Carbon Units (VCUs) based on carbon dioxide removals and VCUs based on greenhouse ...

  29. Red Teaming GPT-4V: Are GPT-4V Safe Against Uni/Multi-Modal Jailbreak

    Various jailbreak attacks have been proposed to red-team Large Language Models (LLMs) and revealed the vulnerable safeguards of LLMs. Besides, some methods are not limited to the textual modality and extend the jailbreak attack to Multimodal Large Language Models (MLLMs) by perturbing the visual input. However, the absence of a universal evaluation benchmark complicates the performance ...

  30. Notice of Proposed Rulemaking Title 27, California Code of Regulations

    Office of Environmental Health Hazard Assessment proposes to amend Article 6 of Title 27 of the California Code of Regulations, section 25607.2. This would be in addition to the applicable safe harbor warnings that already apply to such exposures under existing law. The warning content and methods provided in the safe harbor regulations are deemed "clear and reasonable" by