Extension Educators

  • ANNOUNCEMENTS:
    • Please let us know if you have questions or if we can be of assistance. Email DMhelp@lists.purdue.edu. Thank you.

NOTICE: During the DM rebuilding process, all current Extension and USDA screens are hidden. You are not able to enter any data. DM reports are available.

Digital Measures (DM) Office Hours via ZOOM

No scheduled office hours at this time.
Email DMhelp@lists.purdue.edu for assistance.

 

PURDUE FAST START PROGRAM - Counties Reporting Engagement

The Purdue Fast Start Program screen in DM is for tracking engagements. Each County may determine how best to document and post in DM: individual Educators may report their own engagements, some Counties may coordinate and have one or two people enter data, and some Counties may have multiple people reporting. Just be certain each County reports its engagement activities every month.

The Purdue Fast Start Program screen is unique and takes a different approach to data entry compared to our other DM screens. Most importantly, we are “counting” IMMEDIATE CONTACTS (not the Direct and Indirect Contacts we report on our other Extension screens). Please read the instructions carefully before entering data.

Purdue Fast Start Program INSTRUCTIONS

NEW! Data Entry for 2021 County Fairs

Do you have questions about documenting Purdue's Fast Start Program? Please email DMhelp@lists.purdue.edu.

 

=================================

 

Reporting Expectations for Extension Educators

Schedule for reporting

  • September 1 is the deadline for your annual review process.
  • By the 5th business day, report your monthly:
    • Communiqué to your District Director, and
    • LEARNING EVENTS and OTHER ACTIVITIES in DM.
      • Learning Events – education programs/workshops delivered to the public in person or via technology
      • Other Activities – consultations, community committees/boards/coalitions, popular press (educational articles, newsletters, TV/Radio spots), professional publications
  • May 1 and November 1 are the deadlines to report IMPACT STATEMENTS (narratives sharing the success of a program) in DM, with a minimum of two reported each year (individual and/or team, as appropriate to your role).

Monthly Communiqué or DM?

Educators do both each month. Use the communiqué for reporting activities toward your goals. Then, go to DM to enter outputs (on the learning events and other activities screens).

Targeted Set of Metrics in DM

While we appreciate the totality of Educators’ efforts, in DM we do not try to capture everything Educators do. Our approach is to collect a targeted set of metrics focused on key activities.

However, the communiqué is more flexible than DM, allowing a narrative for documenting progress and activities, as well as marketing and other efforts as needed for Educators.

Also, CED administrative tasks will go in the communiqué only, not in DM.

Lastly, please remember that OUTPUTS are the first step. We also need to report OUTCOMES and IMPACTS, which are the most important information we share. We must be quick and efficient in recording outputs monthly so we may focus on our goal of producing outcomes and impacts.

Program Planning vs. Metrics

To compare the Educators' monthly communiqué and DM: in general, you report the progress of your efforts and all that you do during the planning process in the monthly communiqué. Then, when you implement your program (learning event), you report the metrics in DM. See this diagram.

=================================

 

DM Newly Built Structure for Reporting

NEW! Looking for more information about the NIFA Critical Issues?

  • Check out this document summarizing the Critical Issues.

 

NEW! Review the CHECKLISTS on the DM Learning Events & Impact Statements screens

  • Check out this document summarizing the CHECKLISTS of current Purdue University and College of Agriculture issues, priorities, and initiatives.

 

NEW! 4-H Themes and Other Activities Screen Updates

  • Check out this document about the new 4-H themes that will be used on the upcoming Learning Events and Impact Statement screens.
  • Check out this document to see the structure of the Other Activities screen, which has been simplified and reduced.

 

Need to know more about Purdue Extension GOALS?

  • This document shows the goals and priorities for Extension programs.

 

Learning Events, Other Activities, Impact Statements

  • HELP TIPS: When in Digital Measures, click on the question mark next to each data field. You will see "help tips" for entering your data.

Running a DM Report

  • To run reports of the data entered, follow the instructions here.

=================================

Resources

Need help writing an impact statement?

Since Digital Measures does not save your data as you go, and since it can “time out” if left open and idle for some time, a best practice is to create, edit, finalize, and save your impact statement in Word, then copy and paste it into DM.

Issue (Who Cares and Why)

  • In about three sentences, state the issue or problem addressed by the Extension program/project.
  • Look to the situation statement of the logic model – the ISSUE comes from that information.
  • Describe the problem, need, concern, or situation. Examples of issues may include obesity, drought, lack of leadership knowledge or skills, or the need for stronger science.
  • Explain the relevance of this issue. Why is it important?
  • Share any needs assessment data you have gathered to indicate the prevalence or importance of this issue.
  • Introduce any statistics that may illustrate the problem/issue in the state or among the population.

What Has Been Done (Describe the program/project)

  • In about three to five sentences, describe what you or your team did.
  • Give the title of the program/project.
  • Describe the delivery or implementation, including the quantity of activities (e.g., four-session weekly series; six communities).
  • Indicate the topics that were presented.
  • Look to the inputs and outputs section of the logic model – WHAT HAS BEEN DONE comes from that information.
  • Avoid using acronyms and abbreviations.
  • Write as if you are explaining the program/project to someone who doesn’t know anything about it.

Who Were the Participants (Describe the program learners/attendees by roles, numbers and demographics.)

  • Who was the audience (aka learners)?
  • Describe the audience by their roles (e.g., high school youth, childcare providers, parents, farmers, community leaders, agency representatives, land owners).
  • Look to the outputs section of the logic model – details for this section can come from that.
  • How many youth and/or adults attended? Give the unique number of program participants.
  • Provide participant self-reported demographic information – gender, ethnicity, race, and age – from an evaluation survey, 4-H Online, Survey Builder, Common Measures 2.0 surveys, or CVENT/Salesforce.
  • Provide the total number of learners (youth and/or adult) who completed the evaluation, if applicable.

Results

  • This is a description of what changed because of the program.
  • Share results from the program from the perspective of the audience.
    • What did they learn? Knowledge, attitudes, skills, aspirations.
    • What practices did they adopt or behaviors did they change?
    • How did they benefit from those practices or behaviors?
  • Look to the Outcomes-Impact section of the logic model. Use those to help you create a narrative of the results.
  • Include numbers or percentages to report your evaluation. Include economic indicators if appropriate.
  • As appropriate, combine quantitative data (e.g., number, percentage, dollars) and qualitative data (e.g., anecdotes/narratives or quotes from participants on the program evaluation). Do not share names.
  • Describe the difference your program made for the people of Indiana, and the communities, families, youth, businesses, environment, etc.

Need more information on OUTCOMES?

Outcome indicators are statements created from Outcomes/Impacts posted on program area logic models. These indicators are written in broad concepts to capture a limited list of key results of Extension efforts. The approach to outcome indicators varies between program areas based on the structure and coordination of programs.

  • Short-term outcomes focus on knowledge, attitudes, skills, and aspirations. We often measure the knowledge gained or intentions to take action.
  • Medium-term outcomes are measured after some time has passed. Measures focus on participants taking action, changing behavior, or adopting practices or skills that were learned in the program.
  • Long-term outcomes look at changes that result from the actions/behaviors/practices/skills taken. Measures focus on condition, environmental, economic, civic, or social changes.

 

Looking for LOGIC MODELS?

Looking for HELP nearby?

We have a group of Educators who are "DM Liaisons" located in each Extension Area across the state. They have received additional training and are available to assist Educators with questions. Check with your Area DM Liaisons for help.

Are YOU interested in filling our open spots for DM Liaison? Talk to your Area Director and let them know you are interested or email DMhelp@lists.purdue.edu.

MAP of DM Liaisons

East

Area 7 – Amy Alka, Randolph; Open _____
Area 11 – Caroline Everidge, Huntington; Molly Hoag, Wells

Central

Area 6 – Sarah Hanson, Johnson; Scott Gabbard, Shelby
Area 8 – Adam Shanks, Clinton; Open _____

Northwest

Area 9 – Monica Nagele, Montgomery; Open _____
Area 10 – Mary Foell, LaPorte; Open _____

Southeast

Area 1 – Harriet Armstrong, Bartholomew; Open _____
Area 2 – Megan Broughton, Washington; Gina Anderson, Floyd

Southwest

Area 3 – Hans Schmitz, Posey; Valerie Clingerman, Knox; Sara Dzimianski, Perry
Area 5 – Jenna Nees, Putnam; Jay Christiansen, Vigo

 

=================================

Data - Continuous Quality Improvement

The continuous quality improvement program for Digital Measures data includes several activities:
  • consistent communication, instructions, and training on how to enter data in DM,
  • monthly and quarterly bulk data review and analysis for issues in entered data, with Educator and leadership communication for updates/corrections,
  • monthly administrative reviews by leadership for Areas and Program Areas, with Educator communication and follow-up for updates/corrections, and
  • an annual audit of data with five Educators.

  1. Consistent communication, instructions and tips on how to enter data in DM
    • Julie Huetteman communicates with ELT and our DM Liaisons as new or updated information is available to keep all informed of the current issues. Area Directors and Program Leaders share information through their communication channels, newsletters and emails. DM Liaisons also share information with their Areas at meetings and as appropriate.
    • Instruction documents are embedded on the DM screens so that information is at your fingertips when entering data in DM. Within the instruction documents are URLs linking to more specific information and resources.
    • On the DM screens, the Help Tips question marks have been updated along with the instructions so that each data field has a popup window to provide a description and some instructions. Just clicking on the DM Help Tips question mark next to the data field will bring this popup window into view.
    • DM Virtual Office Hours via ZOOM are made available periodically for Extension personnel to drop in and ask questions.
  2. Quarterly data analysis for issues in reported data
    • Each quarter, administrative reports are run to assess the data. As issues are identified, communication is shared with the Educators and their Area Directors and Program Leaders for reviewing and making corrections or improvements in the data.
  3. Monthly review by Area Directors and Program Leaders with Educator follow-up
    • Extension leadership reviews the data entered by their staff to check for regular and consistent reporting, and for accuracy and completeness of the data. When Area Directors and Program Leaders find discrepancies, they communicate with Educators to assess the information and work toward improved accuracy as appropriate.
  4. Annual audit of data for five Educators
    • Five Educators (one from each District) are randomly selected to show how they documented and tracked the data they entered in Digital Measures. The audit is a review of the data entered on three screens: 1) Learning Events, 2) Other Activities, and 3) Impact Statements. Educators will be asked to share their process for gathering data and the documentation they have for what they entered into DM. In May and June, Julie Huetteman will communicate with the five Educators and their Area Directors and Program Leaders about scheduling this 1-hour appointment as it fits into each Educator's calendar in the upcoming months.

Questions? Please email Julie Huetteman, jhuettem@purdue.edu.