Hello friends! Recently I had the opportunity to participate both as a visual notetaker and as a course facilitator in TechChange‘s 311 course on Technology for Data Visualization. The notes below are from Sara Dean‘s session on using mapping and dataviz software to visualize infrastructure for M&E projects. Sara spoke about her fascinating work at Stamen, as well as all of the benefits of using maps (that you can create yourself through Stamen’s resources!) for your data collection and analysis projects. Take a look at the highlights below:
Happy 2016! I hope you all had a very happy and healthy New Year. Before the holidays, I participated in a session from TechChange’s course on Technology for Data Visualization, featuring the ever-wise Ann K. Emery. During the session, Ann explained how Excel can be used for both exploratory and explanatory purposes. Knowing the difference will save you a lot of time and energy! Excel is an exploratory tool because it allows you to explore different ways of looking at your data and can help you see trends or patterns that you may not have seen otherwise. Essentially, you can use Excel as an analysis tool and also for playing with your data. Excel can also be used as an explanatory tool because it allows you to use the charts or graphs you create to explain your message to your target audience. Ann outlines tips for using Excel for both exploratory and explanatory purposes below. To learn more about Ann and her data viz genius, be sure to check out her site. Until next time, friends. I hope your year is off to a great start!
Happy Monday evaluators! I hope you all had a wonderful weekend. A few days ago, I participated in TechChange’s TC111 session with Dr. Kerry Bruce, Chief Measurement and Impact Evaluation Officer at the Global Fund to End Slavery. In her presentation on “real-time M&E,” Kerry introduced the concept of a “data product” and advocated for the use of both real-time and traditional forms of data to inform organizational decision-making. For those of you who are interested in learning more about real-time data collection methods, take a look at this presentation by my brilliant colleague, Yuqi Wang: Real-Time Evaluation: Tips, Tools and Tricks of the Trade. Scroll down to see highlights from the TechChange course!
Last night, I had the opportunity to attend the acclaimed 2015 Fail Fest in Washington, DC. Hosted by FHI360, Plan International, and TechChange, the event brought hundreds of international development practitioners together to share their failures in a fun and honest way. The biggest takeaway for me (aside from the fact that poop jokes apparently never stop being funny) was that #FailFestDC is an opportunity for us all to make a genuine commitment to failing forward. Being forthcoming about our individual and collective failures is an important first step, but turning failure into an opportunity to learn and grow is even more necessary. Now that we’ve opened the metaphorical can of worms and spilled the beans about our failures, what are we going to commit to do differently going forward? I’d love to hear your thoughts! Thank you to all of the brave souls who offered up their own failures last night (listed below). Scroll down and take a look at the visual notes from the conference!
- Ann Hudock, Senior VP, Plan International USA
- Patrick Fine, CEO, FHI 360
- Jacob Korenblum, CEO, Souktel Digital Solutions
- Nick Martin, CEO, TechChange
- Susan Davis, Executive Director, Improve International
- Piers J.W. Bocock, Chief of Party, USAID LEARN
- Winston Carroo, Director, Agricultural Missions
- Karen Snyder, Director, Free the Slaves
- Tova Scherr, Independent Consultant
- Robert Salerno, Development Specialist, DAI
Hello! I hope you are all having a wonderful week! On November 17, 2015, I had the pleasure of attending USAID Learning Lab‘s conference, “Moving the Needle: Better Development Programming Through Collaborating, Learning and Adapting.” The conference brought together a select group of USAID staff and partners to build a shared understanding of CLA (collaborating, learning and adapting) and how to use CLA to improve development programming. The conference was interactive, engaging and well-organized. In addition to several creative exercises (skits, scenarios, voting boards, etc.), the conference also highlighted case studies of how CLA is being used in practice. I was particularly taken with USAID Uganda’s Mission of Leaders Program and the data use simulation designed to help partners move towards increased data use (see below). I had a great time geeking out over the intersection of M&E, institutional learning, and organizational decision-making with my fellow conference-goers. I would highly recommend looking at the phenomenal CLA tools created by the all-star team at USAID Learning Lab. Take a look at the pictures and highlights from the conference below:
Below is a picture of me at the conference! Recognize the handwriting?
Happy Monday friends! I hope you all had a wonderful Thanksgiving. I am constantly amazed by how much I learn from each of TechChange’s courses. Take a look at the set of notes below from a particularly brilliant session with Vanessa Corlazzoli from Search for Common Ground. Vanessa covered the fundamental principles for integrating M&E and technology:
During the Aspiring Scholars event at #Eval15, I had the opportunity to sit down with Michael Quinn Patton and ask questions about Developmental Evaluation, the appropriate role of an evaluator, and trends in the evaluation field. In addition to answering my questions, Michael also explained why principles-focused evaluation and “evaluative thinking” are so important, noted that knowing whether or not something works is not the same as knowing how to fix it, and provided some tips for aspiring scholars who are pursuing a career in evaluation. Take a look at Michael’s thoughtful reflections below:
When asked to evaluate a complex, multi-year initiative in which players and resources are constantly shifting, how do you make the case for using a Developmental Evaluation approach? In this session from #Eval15, Marcie Parkhurst (FSG), Hallie Preskill (FSG), Jewlya Lynn (Spark Policy Institute) and Marah Moore (i2i Institute) provide some answers. During the session, the audience worked with the presenters to brainstorm ways to advocate for the use of DE when appropriate. The presenters covered key DE concepts; they demonstrated how the design of a DE approach is adaptive, responsive and emergent and how the evaluator serves as a “critical friend” throughout the process. They also acknowledged that DE may not be the best fit for all circumstances, so it’s important to understand the pros and cons of employing a DE approach. Scroll down for highlights from the session!
At the 2015 American Evaluation Association’s Annual Conference in the Windy City, I participated in Amy Germuth‘s professional development course, “The Psychology of Survey Respondents: Implications for Survey Questions and Response Options.” Amy began the session by reviewing the cognitive response process and then outlined considerations that evaluators should make when creating survey questions, along with tips for interpreting survey responses. Take a look at more info on the course and other courses taught by EvalWorks here!
At the 2015 American Evaluation Association’s Annual Conference in Chicago, I had the opportunity to take a professional development course, “Outcome Mapping: An Approach for Considering Complexity, Relationships and Context in Monitoring and Evaluating Social Change.” Simon Hearn of the Overseas Development Institute (ODI) and Kaia Ambrose of CARE Canada covered the seven components of intentional design for outcome mapping. They explained key outcome mapping concepts and techniques, such as how to map out spheres of influence, identify boundary partners, and create progress indicators. For more information on outcome mapping, take a look at Simon and Kaia’s AEA365 blog post and keep up with the OM community on Twitter. Scroll down to see visual notes from the course!