Extension Educators


8/12/2022 -- Be sure to MATCH "method of delivery" and "contacts" in Learning Events 

  • If "in person" or "via technology - live" -- contacts are DIRECT (only) (see image)
  • If "Take-Home Kit/Program" or "via technology - online course / Brightspace" or "via technology recorded/posted" -- document INDIRECT contacts, using the data analytics on reach or number of viewers from the technology used. (see image)
  • If combined METHODS OF DELIVERY -- both DIRECT and INDIRECT are to be reported -- but be sure you have selected each of the delivery methods used.
    • For example -- in person (direct contacts) AND via technology recorded/posted (indirect contacts)
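The delivery-method/contact-type rule above can be sketched as a small helper. This is only an illustrative sketch: the option names mirror the "method of delivery" choices on the Learning Events screen, but the function itself is hypothetical and not part of Digital Measures.

```python
# Illustrative sketch of the delivery-method / contact-type rule.
# Not part of Digital Measures; option names mirror the DM screen choices.

DIRECT_METHODS = {
    "in person",
    "via technology - live",
}

INDIRECT_METHODS = {
    "Take-Home Kit/Program",
    "via technology - online course / Brightspace",
    "via technology recorded/posted",
}

def contact_types(methods):
    """Return which contact counts to report for the selected delivery methods."""
    types = set()
    for method in methods:
        if method in DIRECT_METHODS:
            types.add("DIRECT")
        elif method in INDIRECT_METHODS:
            types.add("INDIRECT")
        else:
            raise ValueError(f"Unknown method of delivery: {method}")
    return sorted(types)
```

For a combined event, `contact_types(["in person", "via technology recorded/posted"])` returns `["DIRECT", "INDIRECT"]`, meaning both contact counts should be reported, matching the combined-methods rule above.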

8/1/2022 -- Extension - Summary Report -- a summary of the Learning Events, Other Activities, and Impact Statements screens. This creates a WORD doc. It is ready for Educators to run for their annual review (date range is Sept. 1, 2021-Aug. 31, 2022). Instructions are here.

5/9/22 -- DM Updates - Statewide Area Meetings - Slides

1/3/2022 -- Demographics: https://extension.purdue.edu/hub/12-22-new-demographic-data-collection-for-purdue-extension/. 

  • Educators will need to make time to review this new approach to collecting demographics and move toward incorporating it into program activities. 
  • Those leading statewide, signature, or multi-site programs are to update, coordinate, and communicate data entry for consistency.


Please let us know if you have questions or if we can be of assistance. Email DMhelp@lists.purdue.edu. Thank you.

Looking for HELP nearby? MAP of DM Liaisons

We have a group of Educators who are "DM Liaisons" located in each Extension Area across the state. They have received additional training and are available to assist Educators with questions. Check with your Area DM Liaisons for help.


Area 1 – Christopher Fogle, Decatur; Open _____
Area 2 – Megan Broughton, Washington; Gina Anderson, Floyd
Area 3 – Valerie Clingerman, Knox; Sara Dzimianski, Perry
Area 5 – Jenna Nees, Putnam; Bob Bruner, Owen
Area 6 – Sarah Hanson, Johnson; Scott Gabbard, Shelby
Area 7 – Amy Alka, Randolph; Kyli Penrod, Blackford
Area 8 – Adam Shanks, Clinton; Open _____
Area 9 – Monica Nagele, Montgomery; Open _____
Area 10 – Mary Foell, LaPorte; Jennifer Haynes, Lake
Area 11 – Caroline Everidge, Huntington; Molly Hoag, Wells

Are YOU interested in filling our open spots for DM Liaison? Talk to your Area Director and let them know you are interested or email DMhelp@lists.purdue.edu.



  • Several reports are generated from data in DM.
    • USDA NIFA Annual Accomplishment Report
    • Purdue Extension Annual Report
    • Purdue Fundamental and Applied Research and Extension Showcase
    • Quarterly Report for Extension Educators
    • Quarterly Highlight for Extension Specialists

Find the Extension Reports on the Purdue Extension website "about us" page.


DM's Newly Built Reporting Structure

Critical Issues

  • Check out this document summarizing the Critical Issues.

CHECKLISTS on the DM Learning Events & Impact Statements screens

  • Check out this document summarizing the CHECKLISTS of current Purdue University and College of Agriculture issues, priorities, and initiatives.

4-H Themes and Other Activities Screen Updates

  • Check out this document about the new 4-H themes that will be used on the upcoming Learning Events and Impact Statement Screens.
  • Check out this document to see the structure of the Other Activities screen which has been simplified and reduced.

Purdue Extension GOALS

  • This document shows the goals and priorities for Extension programs.


Workforce Development (WFD) - Critical Issues, Logic Model, and Evaluation Guide / Question Bank

  • Check out these WFD articles: Logic Model & Evaluation Guide; WFD Program Areas & ANR
  • In case you missed it, watch the WFD webinar series recordings.
  • Here is the new WFD logic model.
  • Refer to the Evaluation Guide and Question Bank for creating survey questions, measuring outcomes, and reporting in DM.
  • Reporting outcomes in DM is organized in two sections:
    • Section 1 - WFD - Workforce Development for all Program Areas
    • Section 2 - WFD - Private and Commercial Pesticide Recertification programs (For ANR only)
  • Check with your Program Leader if you have questions about programs that are considered WFD. Currently, these programs are recognized as WFD:
    • 4-H
      • IN Work / Work Ready
      • Clover Gaming
    • ANR
      • Private Applicator Certification
      • Private Applicator Recertification Program (PARP)
      • Commercial Applicator Certification
      • Commercial Applicator Recertification Program
      • UAV Signature Program
    • CD
      • Work Ready
      • Remote Work
      • FORWARD
    • HHS
      • ServSafe


USDA Guidance and Purdue Extension Instructions for Demographic Data Collection



HELP TIPS: When in Digital Measures, click on the question mark next to each data field. You will see "help tips" for entering your data.


Learning Events

  • Instructions - download PDF
    • Learning Event Information
    • Log in
    • Learning Event Details
      • Month / Year
      • Title
      • Event County
      • Method of Delivery
      • Technology
      • Session, Minutes, Youth / Adult Learners (Direct Contacts)
      • Unique Individuals
      • Self-Reported Demographics
      • Youth Demographics
      • Adult Demographics
      • Indirect Contacts
      • Keywords
    • Involvement
    • Evaluation / Outcomes
      • Evaluation Plan or Method
      • Number of Participants Who Completed the Evaluation
      • Outcomes - Logic Models, Templates, Evaluation Guides and Question Banks
      • Purdue Extension Logic Model
      • Purdue Research Logic Model
      • Workforce Development Logic Model and Evaluation Guide and Question Bank
        • Section 1 WFD - Workforce Development (for all Program Areas)
        • Section 2 WFD - Private and Commercial Pesticide Recertification Programs (for ANR only)
      • Program Area Outcomes and Logic Models
        • ANR
        • CD
        • HHS
        • 4-H
    • NIFA / Purdue & CoA / Purdue Extension
      • NIFA
        • NIFA Critical Issues
        • NIFA Extension / Research
      • Purdue University & College of Agriculture
        • Key Initiatives
        • Current Issues
        • Diversity, Equity and Inclusion (DEI) Priorities
        • Commercialization Priorities
      • Purdue Extension
          • Primary and Secondary Program Area Themes
          • Purdue Extension Goals


Other Activities


Impact Statements

Since Digital Measures does not save your data as you go, and can "time out" if left open and idle, the best practice is to create, edit, finalize, and save your impact statement in WORD, then copy and paste it into DM.

Issue (Who Cares and Why)

  • In about three sentences, state the issue or problem addressed by the Extension program/project.
  • Look to the situation statement of the logic model – the ISSUE comes from that information.
  • Describe the problem, need, concern, or situation. Examples of issues may include: obesity, drought, lack of leadership knowledge or skills, or the need for stronger science.
  • Explain the relevance of this issue. Why is it important?
  • Share any needs assessment data you have gathered to indicate the prevalence or importance of this issue.
  • Introduce any statistics that may illustrate the problem/issue in the state or among the population.

What Has Been Done (Describe the program/project)

  • In about three to five sentences, describe what you or your team did.
  • Give the title of the program/project.
  • Describe the delivery or implementation, including the quantity of activities (e.g., four-session weekly series; six communities).
  • Indicate the topics that were presented.
  • Look to the inputs and outputs section of the logic model – WHAT HAS BEEN DONE comes from that information.
  • Avoid using acronyms, abbreviations, and jargon.
  • Write as if you are explaining the program/project to someone who doesn't know anything about it.

Who Were the Participants (Describe the program learners/attendees by roles, numbers and demographics.)

  • Who was the audience (aka learners)?
  • Describe the audience by their roles (e.g., high school youth, childcare providers, parents, farmers, community leaders, agency representatives, land owners).
  • Look to the outputs section of the logic model – details for this section can come from that.
  • How many Youth and/or how many Adults attended? Give the unique number of program participants.
  • Provide participant self-reported demographics information – gender, ethnicity, race and age from evaluation survey, 4-H Online, Survey Builder, Common Measures 2.0 surveys, or CVENT/Salesforce.
  • Provide the total number of learners (youth and/or adult) who completed the evaluation, if applicable.


Results (Describe what changed because of the program.)

  • This is a description of what changed because of the program.
  • Share results from the program from the perspective of the audience.
    • What did they learn? Knowledge, attitudes, skills, aspirations.
    • What practices did they adopt or behaviors did they change?
    • How did they benefit from those practices or behaviors?
  • Look to the Outcomes-Impact section of the logic model. Use those to help you create a narrative of the results.
  • Include numbers or percentages to report your evaluation. Include economic indicators if appropriate.
  • As appropriate, combine quantitative data (e.g., numbers, percentages, dollars) and qualitative data (e.g., anecdotes/narratives or quotes from participants on the program evaluation; do not share names).
  • Describe the difference your program made for the people of Indiana, and the communities, families, youth, businesses, environment, etc.


Purdue Fast Start Program (Counties Reporting Engagement)

The Purdue Fast Start Program screen in DM is for tracking engagements. Counties may determine how to best document and post in DM. Individual Educators may report engagements. Some Counties may coordinate and have one or two people enter data. Some Counties may have multiple people reporting. Just be certain each County is reporting engagement activities every month.

The Purdue Fast Start Program screen is unique and has a different approach to data entry compared to our other DM screens. Most importantly, we are “counting” IMMEDIATE CONTACTS (not the Direct and Indirect Contacts we report on our other Extension screens). Please, read carefully before entering data.

Purdue Fast Start Program INSTRUCTIONS

NEW! Data Entry for 2021 County Fairs

Do you have questions about documenting Purdue's Fast Start Program? Please email DMhelp@lists.purdue.edu.


Running a DM Report

  • To run reports of the data entered, follow the instructions here.


Need more information on OUTCOMES?

Outcome indicators are statements created from Outcomes/Impacts posted on program area logic models. These indicators are written in broad concepts to capture a limited list of key results of Extension efforts. The approach to outcome indicators varies between program areas based on the structure and coordination of programs.

  • Short-term outcomes focus on knowledge, attitudes, skills, and aspirations. We often measure the knowledge gained or intentions to take action.
  • Medium-term outcomes are measured after some time has passed. Measures focus on participants taking action, changing behavior, or adopting practices or skills that were learned in the program.
  • Long-term outcomes look at changes that result from the actions, behaviors, practices, or skills adopted. Measures focus on condition, environmental, economic, civic, or social changes.

Looking for LOGIC MODELS?


Reporting Expectations for Extension Educators

Schedule for reporting

  • September 1 is the deadline for your annual review process.
  • By the 5th business day of each month, report your monthly:
    • Communiqué to your District Director, and
    • DM entries:
      • Learning Events – education programs/workshops delivered to the public in person or via technology
      • Other Activities – consultations; community committees/boards/coalitions; popular press (educational articles, newsletters, TV/radio spots); professional publications
  • May 1 and November 1 are the deadlines to report IMPACT STATEMENTS (narratives sharing the success of a program); report a minimum of two each year in DM (individual and/or team, as appropriate to your role).

Monthly Communiqué or DM?

Educators do both each month. Use the communiqué for reporting activities toward your goals. Then, go to DM to enter outputs (on the learning events and other activities screens).

Targeted Set of Metrics in DM

While we appreciate the totality of Educators’ efforts, in DM we do not try to capture everything Educators do. Our approach is to collect a targeted set of metrics focused on key activities.

The communiqué, however, is more flexible than DM: it provides a narrative for documenting progress, marketing, and other activities as needed by Educators.

Also, CED administrative tasks will go in the communiqué only, not in DM.

Lastly, please remember that OUTPUTS are the first step. We also need to report OUTCOMES and IMPACTS which are the most important information we share. We must be quick and efficient in recording outputs monthly so we may focus on our goal of producing outcomes and impacts.

Program Planning vs. Metrics

To compare the Educators' monthly communiqué and DM: in general, you report the progress of your efforts and all of your planning-process work in the monthly communiqué. Then, when you implement your program (learning event), you report the metrics in DM. See this diagram.


Data - Continuous Quality Improvement

The continuous quality improvement program for Digital Measures data includes several activities:

  • consistent communication, instructions, and training on how to enter data in DM;
  • monthly and quarterly bulk data review and analysis for issues in entered data, with Educator and leadership communication for updates/corrections;
  • monthly administrative reviews by leadership for Areas and Program Areas, with Educator communication and follow-up for updates/corrections; and
  • an annual audit of data with five Educators.

  1. Consistent communication, instructions and tips on how to enter data in DM
    • Julie Huetteman communicates with ELT and our DM Liaisons as new or updated information is available to keep all informed of the current issues. Area Directors and Program Leaders share information through their communication channels, newsletters and emails. DM Liaisons also share information with their Areas at meetings and as appropriate.
    • Instruction documents are embedded on the DM screens so that information is at your fingertips when entering data in DM. Within the instruction documents are URLs linking to more specific information and resources.
    • On the DM screens, the Help Tips question marks have been updated along with the instructions so that each data field has a popup window to provide a description and some instructions. Just clicking on the DM Help Tips question mark next to the data field will bring this popup window into view.
    • DM Office Virtual Hours via ZOOM are made available periodically for Extension personnel to drop in to ask questions.
  2. Quarterly data analysis for issues in reported data
    • Each quarter, administrative reports are run to assess the data. As issues are identified, communication is shared with the Educators and their Area Directors and Program Leaders for reviewing and making corrections or improvements in the data.
  3. Monthly review by Area Directors and Program Leaders with Educator follow-up
    • Extension leadership reviews the data entered by their staff to check for regular and consistent reporting, and for accuracy and completeness of the data. When Area Directors and Program Leaders find discrepancies, they communicate with Educators to assess the information and work toward improved accuracy as appropriate.
  4. Annual audit of data for five Educators
    • Five Educators (one from each District) are randomly selected to show how they documented and tracked the data they entered in Digital Measures. The audit process reviews the data entered on three screens: 1) Learning Events, 2) Other Activities, and 3) Impact Statements. Educators will be asked to share their process for gathering data and the documentation behind what they entered into DM. In May and June, Julie Huetteman will communicate with the five Educators and their Area Directors and Program Leaders about scheduling this one-hour appointment as it fits into each Educator's calendar in the upcoming months.

Questions? Please email Julie Huetteman, jhuettem@purdue.edu.