- Please let us know if you have questions or if we can be of assistance. Email DMhelp@lists.purdue.edu. Thank you.
- DM Updates for Extension Educators at PDC, December 8, 2020. Slides and the presentation recording are available on box.com in the "Professional Development Conference 2020" folder. Here is the link.
- REPORTS ABOUT EXTENSION (aka, SHARING OUR STORIES!)
- Purdue Fundamental and Applied Research and Extension Showcase
- Quarterly Report for Extension. See 2021 Q1 report here.
Digital Measures (DM) Office Hours via ZOOM
We have scheduled times to be available via ZOOM to assist Educators entering data and running reports in DM. Educators may drop in at any time during these sessions to ask questions and get help with their DM activities.
Join Julie Huetteman and Brad Sewell to ask your questions about DM.
Email DMhelp@lists.purdue.edu for the ZOOM connection.
(All times are Eastern)
Join the ZOOM session.
Email DMhelp@lists.purdue.edu for assistance.
PURDUE FAST START PROGRAM - Counties Reporting Engagement
The Purdue Fast Start Program screen in DM is for tracking engagements. Counties may determine how best to document and post in DM. Individual Educators may report engagements; some Counties may coordinate and have one or two people enter data, while others may have multiple people reporting. Just be certain each County is reporting engagement activities every month.
The Purdue Fast Start Program screen is unique and has a different approach to data entry compared to our other DM screens. Most importantly, we are “counting” IMMEDIATE CONTACTS (not the Direct and Indirect Contacts we report on our other Extension screens). Please, read carefully before entering data.
Purdue Fast Start Program INSTRUCTIONS
NEW! Data Entry for 2021 County Fairs
Do you have questions about documenting Purdue's Fast Start Program? Please email DMhelp@lists.purdue.edu.
Schedule for reporting
- September 1 is the deadline for your annual review process.
- By the 5th business day, report your monthly:
- Communiqué to your District Director, and
- LEARNING EVENTS and OTHER ACTIVITIES in DM.
- Learning Events – education programs/workshops delivered to the public in person or via technology
- Other Activities – consultations, community committees/boards/coalitions, popular press (educational articles, newsletters, TV/Radio spots), professional publications
- May 1 and November 1 are the deadlines to report IMPACT STATEMENTS (narratives sharing the success of a program), with a minimum of two reported each year in DM (individual and/or team, as appropriate to your role).
Monthly Communiqué or DM?
Educators do both each month. Use the communiqué for reporting activities toward your goals. Then, go to DM to enter outputs (on the learning events and other activities screens).
Targeted Set of Metrics in DM
While we appreciate the totality of Educators’ efforts, in DM we do not try to capture everything Educators do. Our approach is to collect a targeted set of metrics focused on key activities.
However, the communiqué can be more flexible than DM: it offers a narrative format for documenting progress and activities, including marketing and other efforts, as needed by Educators.
Also, CED administrative tasks will go in the communiqué only, not in DM.
Lastly, please remember that OUTPUTS are only the first step. We also need to report OUTCOMES and IMPACTS, which are the most important information we share. We must be quick and efficient in recording outputs monthly so we can focus on our goal of producing outcomes and impacts.
Program Planning vs. Metrics
To compare the monthly communiqué and DM: in general, report the progress of your efforts and all that you do during the planning process in the monthly communiqué. Then, when you implement your program (learning event), report the metrics in DM. See this diagram.
Learning Events, Other Activities, Impact Statements
Instructions for Extension Educators to enter data into Digital Measures for Learning Events, Other Activities, and Impact Statements:
HELP TIPS: When in Digital Measures, click on the question mark to open a separate window with instructions. You will see "help tips" with each data field.
Running a DM Report
To run reports of the data entered, check the instructions here.
Need help writing an impact statement?
Since Digital Measures does not save your data as you go, and since it can "time out" if left open and idle for some time, a best practice is to create, edit, finalize, and save your impact statement in WORD, then copy and paste it into DM.
You may also use the "PasteBoard" in DM as a place to paste your text from the WORD document. Once you put the text in the PasteBoard, check for and correct any text/character issues from the conversion, then highlight it and drag and drop it into the appropriate text box.
Issue (Who Cares and Why)
In about three sentences, state the issue or problem addressed. Describe the problem and explain why this issue is relevant and important. This section demonstrates what the issue or needs are and why the program is needed. Introduce any statistics that illustrate the problem/issue in the state or among the population, and share any needs assessment data you have gathered to indicate the prevalence or importance of this issue.
What Has Been Done
In about three to five sentences, describe what you or your team did. What was your response to the issue/problem? What did you do to address it? How did your work resolve it? Describe the program activities that were conducted, explain the key elements of your program, describe the target audience, and describe the delivery of your program/project. Include the quantity of these activities (e.g., four-session weekly series; 59 childcare providers; six communities). Provide a description of the program/project: what was taught and why, who the audience was (e.g., how many attended), and how it was implemented (e.g., a nine-week series of workshops).
Now, share results from the program/project from the perspective of the participants or attendees. What did they get out of it? What did they learn? If a follow-up evaluation was used, report on what actions they have taken or changes they have made since attending the program. Include numbers or percentages to report your evaluation. Include economic indicators if appropriate. Include a narrative about, or from, the attendees.
As appropriate, combine quantitative data (e.g., numbers, percentages, dollars) and qualitative data (e.g., anecdotes/narratives or quotes from participants on program evaluations). Don't share names, but do include participants' titles/roles (e.g., parents, producers, childcare providers).
This is the most important part of your impact statement and most likely the longest section. Tie the results back to the problem set in “Issue” above. Describe what happened as a result of the efforts described in “What Has Been Done.” What changed as a result of the Extension effort? What difference did this make for Indiana residents? What are the benefits? What is the impact of this effort? For short-term changes in knowledge, attitudes, skills, and aspirations, consider including a statement about what that change does for, or means to, participants. Describe the difference your program/project made for the people of Indiana, and the state’s communities, families, businesses, environment, etc.
Need more information on OUTCOMES?
Outcome indicators are statements created from the Outcomes/Impacts posted on program area logic models. These indicators are written as broad concepts to capture a limited list of key results of Extension efforts. The approach to outcome indicators varies among program areas based on the structure and coordination of programs.
Short-term — Outcome indicators for Learning Events are short-term and generally focus on knowledge gained or intentions to take action. Select outcome indicator(s) only if the event is completed and the results are in hand as reported for this month, and enter the number.
Medium- to long-term — With Impact Statements, outcome indicators are medium- to long-term, ranging from behavior change or adoption of a practice to condition-level environmental, economic, or social changes.
Looking for our logic models (containing outcomes) by program area?
- 4-H Metrics & Logic Models
- ANR Metrics & Logic Models
- CD Metrics & Logic Models
- HHS Metrics & Logic Models
Need to know more about Purdue Extension Goals?
This document shows the goals and priorities for Extension programs.
Digital Measures Liaisons
Thanks to the many Educators who helped during the testing and development of the current metrics approach for Extension, as well as its ongoing updates. We now have a group of "DM Liaisons" for each area across the state who serve as resources for Educators.
Area 1 – Harriet Armstrong, Bartholomew; Open _____
Area 2 – Megan Broughton, Washington; Open _____
Area 3 – Hans Schmitz, Posey; Open _____
Area 5 – Jenna Nees, Putnam; Jay Christiansen, Vigo
Area 6 – Sarah Hanson, Johnson; Scott Gabbard, Shelby
Area 7 – Molly Hunt, Delaware; Open _____
Area 8 – Adam Shanks, Clinton; Open _____
Area 9 – Monica Nagele, Montgomery; Open _____
Area 10 – Mary Foell, LaPorte; Open _____
Area 11 – Caroline Everidge, Huntington; Molly Hoag, Wells