Digital Measures (DM) is the system used for annual data entry. Whether you conduct research or Extension activities, you will report into DM for the calendar year. Custom screens in DM capture data needed for required reporting to USDA for Research and Extension activities.

Here is a letter from Dr. Jason Henderson on Extension expectations.

Extension Reports (aka, Sharing Our Stories!)

This website provides information for Extension Specialists on entering data into DM for:

  • Using DM for your CV
  • USDA Reporting
  • Annual Activity Review (if your Department uses DM for this)
  • Publications
  • Web profile

DM Updates for Extension Specialists was presented at the PDC on December 8, 2020. The slides and presentation recording are available on box.com in the "Professional Development Conference 2020" folder. Here is the link.

Use DM for your CV!

DM is a powerful CV-oriented database that you are free to start using at any time. Send your CV to dmcv@purdue.edu and our DM student workers will enter it for you, saving you time and getting you started.

USDA Reporting

Deadline for reporting Jan.-Dec. 2020: February 16, 2021

Extension Specialists are expected to report outputs, outcomes and impacts into DM. This includes reporting of workshops, conferences, events and recurring programs considered to be structured, educational events for the public.

  • Outputs are your ACTIVITIES - research projects, research publications, Extension publications, workshops, consultations, volunteers, volunteer hours, and direct & indirect contacts.
  • Outcomes are METRICS you use to share results of your project or program.
  • Impact statements are NARRATIVES describing: 1) an issue, 2) what has been done, and 3) the results.

Digital Measures

There are four DM screens for research and/or Extension activities. One or more may fit your role.

  • USDA Research, Extension and Programmatic Impacts
  • Extension Educators or Specialists - Learning Events
  • Extension Educators or Specialists - Other Activities
  • Extension Educators or Specialists - Impact Statements

Your role determines which DM screen(s) work best.

RESEARCH

Instructions for USDA-Funded Research Reporting

DM SCREEN: USDA Research, Extension and Programmatic Impacts

EXTENSION

Instructions for Extension Reporting

DM SCREENS: You may use the USDA Research, Extension & Programmatic Impacts screen and/or the three Extension screens - Learning Events, Other Activities, and Impact Statements.

Running a DM Report

Go to "getting data out" for information on running DM reports to pull Extension data you have entered.

Resources

Need help writing an impact statement?

Because Digital Measures does not save your data as you go and can “time out” if left open and idle for some time, a best practice is to create, edit, finalize, and save your impact statement in Word, then copy and paste it into DM.

You may also use the “PasteBoard” in DM as a place to paste text from your Word document. Once the text is in the PasteBoard, check for and correct any text or character issues from the conversion, then highlight it, drag it, and drop it into the appropriate text box.

Issue (Who Cares and Why)
In about three sentences, state the issue or problem addressed and explain why it is important and relevant. This section demonstrates what the issue or need is and why the program is needed. Include any statistics that illustrate the problem in the state or among the affected population, and share any needs assessment data you have gathered to indicate the prevalence or importance of this issue.

What Has Been Done
In about three to five sentences, describe what you or your team did in response to the issue or problem and how your work addressed it. Describe the program activities that were conducted, the key elements of your program, the target audience, and how the program/project was delivered. Include the quantity of these activities (e.g., a four-session weekly series; 59 childcare providers; six communities). You might also describe what was taught and why, who the audience was (e.g., how many attended), and how the program was implemented (e.g., a nine-week series of workshops).

Results
Now, share results from the program/project from the perspective of the participants or attendees. What did they get out of it? What did they learn? If a follow-up evaluation was used, report what actions participants have taken or changes they have made since attending the program. Include numbers or percentages from your evaluation, and include economic indicators if appropriate. Include a narrative about, or from, the attendees.

As appropriate, combine quantitative data (e.g., numbers, percentages, dollars) with qualitative data (e.g., anecdotes, narratives, or quotes from participants on the program evaluation). Don’t share names, but do include participants’ titles or roles (e.g., parents, producers, childcare providers).

This is the most important part of your impact statement and most likely the longest section. Tie the results back to the problem described in “Issue” above, and describe what happened as a result of the efforts described in “What Has Been Done.” What changed as a result of the Extension effort? What difference did this make for Indiana residents? What are the benefits and the impact of this effort? For short-term changes in knowledge, attitudes, skills, and aspirations, consider including a statement about what that change does for, or means to, participants. Describe the difference your program/project made for the people of Indiana and the state’s communities, families, businesses, environment, etc.

Need more information on Extension OUTCOMES?

Outcome indicators are statements created from the Outcomes/Impacts posted on program-area logic models. These indicators are written in broad terms to capture a limited list of key results of Extension efforts. The approach to outcome indicators varies among program areas based on the structure and coordination of programs.

Short-term: Outcome indicators for Learning Events are short-term and generally focus on knowledge gained or intentions to take action. Select outcome indicator(s) only if the event is completed and results are in hand for the month being reported, and enter the number.

Medium- to long-term: Outcome indicators for Impact Statements are medium- to long-term, ranging from behavior change or adoption of a practice to changes in conditions, whether environmental, economic, or social.

Looking for our logic models (containing outcomes) by program area?

Need to know more about Purdue Extension Goals?

This document shows the goals and priorities for Extension programs.

Annual Activity Review

This section discusses entering faculty accomplishments and activities, which are then used for various reporting purposes:

  • Annual Activities Review
  • Promotion & Tenure
  • ABET accreditation
  • Other university requests

This screenshot highlights the fields which are most commonly used in Annual Activity Reviews.

Classes Taught (under the "Teaching" header) and Contracts, Fellowships, Grants and Sponsored Research (under the "Scholarship" header) are populated from BANNER and COEUS/SPS. Please check these entries for accuracy.

All data fields which are not highlighted are populated by data entered by individual faculty or by DM students working from your CV. Within each screen, you are able to edit existing entries (unless they are locked), or create new entries using the +Add New Item block on the upper right of the screen.

What data do I need to enter?

The blue boxes in the screenshot above illustrate the fields most commonly used in Annual Activity Reviews. Brief instructions on how to enter information into DM can be found here.

  • Directed Student Learning (under the "Teaching" header) is the space for entering involvement on thesis and dissertation committees. There should be one item for each student whose direction you are involved in.
  • Academic Mentoring (under the "Teaching" header) is where you should enter/quantify/describe student mentoring that doesn't fit within an actual class or committee. This would include things like advising student groups or providing career guidance.
  • Publications data (under the "Scholarship" header) can be imported from a variety of databases. We encourage faculty to import publications using a BibTeX file or a third party. Instructions on how to enter publications are included in the "Publications" tab of the main Faculty page, or through the following links:
  • Presentations and Patents & Copyrights should be entered in their respective spaces under the "Scholarship" header.
  • The "Service" header contains opportunities to enter your University service, service to Professional organizations and journals, and Public service.
  • Annual Activities Narratives (under the "Narratives" header) allows you to enter narratives (text descriptions) for all the components that will be included in your Annual Review:
    • Teaching
    • Student Supervision/Mentoring
    • Publications/Presentations
    • Research
    • Leadership & Service
    • Awards & Honors
    • Diversity & Inclusion
    • Engagement
    • Extension
    • International/Global Impacts
    • Departmental Questions covering performance targets and career goals, departmental support & challenges, etc.

Review your report

Go to “getting data out” for information on running DM reports to pull data you have entered.

Publications

There are multiple methods for uploading citations into DM. We highly recommend importing your citations using a BibTeX file (exported from Google Scholar, EndNote, Zotero, etc.) or using a third-party source (e.g., Web of Science or PubMed). Simple instructions for importing your publications using a BibTeX file or a third party can be found here.
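
If you have not worked with BibTeX before, the sketch below shows roughly what a single entry in an exported .bib file looks like; the citation details are illustrative placeholders, not a real publication.

  @article{smith2020placeholder,
    author  = {Smith, Jane A. and Doe, John B.},
    title   = {A Placeholder Article Title},
    journal = {Journal of Placeholder Studies},
    year    = {2020},
    volume  = {12},
    number  = {3},
    pages   = {45--67},
    doi     = {10.0000/placeholder}
  }

Most reference managers (e.g., Zotero or EndNote) and Google Scholar can export your publication list as a .bib file made up of entries like this, which DM can then import.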

Another option is the CV import tool. However, this method is more labor intensive and does not automatically pull in DOIs. New faculty, send your CV to dmcv@purdue.edu and our student workers will upload it for you, saving you time.

Web Profile

Publications, Awards and Honors, and Patents are populated on your college web profile from Digital Measures.

Actions to update your web profile

  • Go to your web profile. If you are happy with it, that's it! If you want to make changes, log into Digital Measures, go to the Tools section, and click on Web Profile. You will see sections for Publications, Awards and Honors, and Patents.
  • Here is a short video showing you how to review and make changes. These instructions include a brief tutorial on how to revise the publications listed on your web profile.
  • To update your personal web profile (e.g., bio, lab pictures), you need to contact your departmental IT staff. In this case, you create the information and they upload it for you.

Note: Data may take up to 30 minutes to update.