Digital Measures (DM) is the system used for annual data entry. Whether you conduct research or Extension activities, you will report into DM for the calendar year. Custom screens in DM capture the data needed for required USDA reporting on Research and Extension activities.


NOTICE: During the DM rebuilding process (Fall 2021), all current Extension and USDA screens are hidden, and you are not able to enter any data. DM reports remain available.


Here is a letter from Dr. Jason Henderson on Extension expectations.


This website contains information for Extension Specialists on entering data into DM, covering:

  • Using DM for your CV
  • USDA Reporting
  • Annual Activity Review (if your Department uses DM for this)
  • Publications
  • Web profile


Use DM for your CV!

DM is a powerful CV-oriented database that you are free to start using at any time. Send your CV to dmcv@purdue.edu and our DM student workers will enter it for you, saving you time and getting you started.

USDA Reporting

Deadline for reporting Jan.-Dec. 2021: February TBD, 2022

Extension Specialists are expected to report outputs, outcomes, and impacts in DM. This includes workshops, conferences, events, and recurring programs that are structured educational events for the public.

  • Outputs are your ACTIVITIES - research projects, research publications, Extension publications, workshops, consultations, volunteers, volunteer hours, and direct & indirect contacts.
  • Outcomes are METRICS you use to share results of your project or program.
  • Impact statements are NARRATIVES describing: 1) an issue, 2) what has been done, 3) who were the participants, and 4) the results.

Digital Measures

There are four DM screens for research and/or Extension activities. One or more may fit your role.

  • USDA (Research)
  • Learning Events (workshops, programs, webinars, etc.)
  • Other Activities (consultations, committees, popular press, professional publications, indirect contacts)
  • Impact Statements (narrative or story about the results of your Extension program or project)

RESEARCH

Instructions for USDA-Funded Research Reporting

DM SCREEN: USDA

EXTENSION

Instructions for Extension Reporting

DM SCREENS: You may use the USDA screen and/or the three Extension screens - Learning Events, Other Activities, and Impact Statements.

  • Instructions by screen:
  • Please note: The DM Learning Events and Other Activities screens are built for MONTHLY reporting because Educators are required to report monthly. Extension Specialists, however, are required to report ANNUALLY. As a result, Extension Specialists who use the Learning Events and Other Activities screens may choose to "summarize" their entire year by putting entries into the month of December.

Running a DM Report

Go to "getting data out" for information on running DM reports to pull Extension data you have entered.

Resources

Need help writing an impact statement?

Since Digital Measures does not save your data as you go, and since it can "time out" if left open and idle for some time, a best practice is to create, edit, finalize, and save your impact statement in Word, then copy and paste it into DM.

Issue (Who Cares and Why)

  • In about three sentences, state the issue or problem addressed by the Extension program/project.
  • Look to the situation statement of the logic model – the ISSUE comes from that information.
  • Describe the problem, need, concern, or situation. Examples of issues may include: obesity, drought, lack of leadership knowledge or skills, or the need for stronger science.
  • Explain the relevance of this issue. Why is it important?
  • Share any needs assessment data you have gathered to indicate the prevalence or importance of this issue.
  • Introduce any statistics that may illustrate the problem/issue in the state or among the population.

What Has Been Done (Describe the program/project)

  • In about three to five sentences, describe what you or your team did.
  • Give the title of the program/project.
  • Describe the delivery or implementation, including the quantity of activities (e.g., four-session weekly series; six communities).
  • Indicate the topics that were presented.
  • Look to the inputs and outputs section of the logic model – WHAT HAS BEEN DONE comes from that information.
  • Avoid using acronyms, abbreviations, and jargon.
  • Write as if you are explaining the program/project to someone who doesn’t know anything about it.

Who Were the Participants (Describe the program learners/attendees by roles, numbers and demographics.)

  • Who was the audience (aka learners)?
  • Describe the audience by their roles (e.g., high school youth, childcare providers, parents, farmers, community leaders, agency representatives, land owners).
  • Look to the outputs section of the logic model – details for this section can come from that.
  • How many Youth and/or how many Adults attended? Give the number of unique program participants.
  • Provide participants' self-reported demographic information (gender, ethnicity, race, and age) from evaluation surveys, 4-H Online, Survey Builder, Common Measures 2.0 surveys, or CVENT/Salesforce.
  • Provide the total number of learners (youth and/or adult) who completed the evaluation, if applicable.

Results

  • Describe what changed because of the program.
  • Share results from the program from the perspective of the audience.
    • What did they learn? Knowledge, attitudes, skills, aspirations.
    • What practices did they adopt or behaviors did they change?
    • How did they benefit from those practices or behaviors?
  • Look to the Outcomes-Impact section of the logic model. Use those to help you create a narrative of the results.
  • Include numbers or percentages to report your evaluation. Include economic indicators if appropriate.
  • As appropriate, combine quantitative data (e.g., numbers, percentages, dollars) with qualitative data (e.g., anecdotes, narratives, or quotes from participants on program evaluations; don't share names).
  • Describe the difference your program made for the people of Indiana, and the communities, families, youth, businesses, environment, etc.

Need more information on OUTCOMES?

Outcome indicators are statements created from the Outcomes/Impacts posted on program area logic models. These indicators are written as broad concepts to capture a limited list of key results of Extension efforts. The approach to outcome indicators varies between program areas based on the structure and coordination of programs.

  • Short-term outcomes focus on knowledge, attitudes, skills, and aspirations. We often measure the knowledge gained or intentions to take action.
  • Medium-term outcomes are measured after some time has passed. Measures focus on participants taking action, changing behavior, or adopting practices or skills that were learned in the program.
  • Long-term outcomes look at changes that result from the actions, behaviors, practices, or skills adopted. Measures focus on condition, environmental, economic, civic, or social changes.


Looking for LOGIC MODELS?

Annual Activity Review

This section will discuss entering faculty accomplishments and activities, which are then used for various reporting purposes:

  • Annual Activities Review
  • Promotion & Tenure
  • ABET accreditation
  • Other university requests

This screenshot highlights the fields which are most commonly used in Annual Activity Reviews.

Classes Taught (under the "Teaching" header) and Contracts, Fellowships, Grants and Sponsored Research (under the "Scholarship" header) come from BANNER and COEUS/SPS. Please check these entries for accuracy.

All data fields which are not highlighted are populated by data entered by individual faculty or by DM students working from your CV. Within each screen, you are able to edit existing entries (unless they are locked), or create new entries using the +Add New Item block on the upper right of the screen.

What data do I need to enter?

The blue boxes in the screenshot above illustrate the fields which are most commonly used in Annual Activity Reviews. Brief instructions on how to enter information into DM can be found here.

  • Directed Student Learning (under the "Teaching" header) is the space for entering involvement on thesis and dissertation committees. There should be one item for each student whose direction you are involved in.
  • Academic Mentoring (under the "Teaching" header) is where you should enter/quantify/describe student mentoring that doesn't fit within an actual class or committee. This would include things like advising student groups or providing career guidance.
  • Publications data (under the "Scholarship" header) can be imported from a variety of databases. We encourage faculty to import publications using a BibTeX file or a third party. Instructions on how to enter publications are included in the "Publications" tab of the main Faculty page, or through the following links:
  • Presentations and Patents & Copyrights should be entered in their respective spaces under the "Scholarship" header.
  • The "Service" header contains opportunities to enter your University service, service to Professional organizations and journals, and Public service.
  • Annual Activities Narratives (under the "Narratives" header) is where you enter narratives (text descriptions) for all the components that will be included in your Annual Review:
    • Teaching
    • Student Supervision/Mentoring
    • Publications/Presentations
    • Research
    • Leadership & Service
    • Awards & Honors
    • Diversity & Inclusion
    • Engagement
    • Extension
    • International/Global Impacts
    • Departmental Questions covering performance targets and career goals, departmental support & challenges, etc.

Review your report

Go to “getting data out” for information on running DM reports to pull data you have entered.

Publications

There are multiple methods to upload citations into DM. We highly recommend importing your citations using a BibTeX file (e.g., Google Scholar, EndNote, Zotero, etc.) or using a third party (e.g., Web of Science, Pubmed, etc.). Simple instructions for importing your publications using a BibTeX file or third party can be found here.
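If you have not worked with BibTeX before: a BibTeX file is a plain-text list of citation records that reference managers and databases such as Google Scholar, EndNote, and Zotero can export. As a rough illustration (every citation detail below is made up), a single entry looks like this:

  @article{smith2021example,
    author  = {Smith, Jane A. and Doe, John B.},
    title   = {An Example Article Title},
    journal = {Journal of Extension},
    year    = {2021},
    volume  = {59},
    number  = {2},
    pages   = {1--10},
    doi     = {10.0000/example}
  }

Because each entry can carry a doi field, importing a BibTeX file brings your DOIs into DM automatically.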

Another option is the CV import tool. However, this method is more labor-intensive and does not automatically pull in DOIs. New faculty: send your CV to dmcv@purdue.edu and our student workers will upload it for you, saving you time.

Web Profile

Publications, Awards and Honors, and Patents are populated on your college web profile from Digital Measures.

Actions to update your web profile

  • Go to your web profile. If you are happy with it, that's it! If you want to make changes, log into Digital Measures, go to the Tools section, and click on Web Profile. You will see sections for Publications, Awards and Honors, and Patents.
  • Here is a short video showing you how to review and make changes. These instructions include a brief tutorial on how to revise the publications listed on your web profile.
  • To update your personal web profile (e.g., bio, lab pictures), contact your departmental IT staff. In this instance, you create the information and they upload it for you.

Note: Data may take up to 30 minutes to update.