Pixelmator v Acorn v Photoshop v GIMP :: Two Aspects

In an effort to find a Photoshop alternative at a significantly lower cost, we've purchased both Pixelmator and Acorn. Last fall I worked mostly with Acorn and GIMP, and I'm hoping to use Pixelmator more this fall in the DOCC node I work with.

Creating a defined selection box of X pixels by Y pixels

  • Acorn: Yes
  • GIMP: Yes
  • Photoshop: Yes (link is to one description of doing this in an older version, but I expect it's been substantially carried forward.)
  • Pixelmator: No

Animated GIFs

  • Acorn: No (a sin of omission)
  • GIMP: Yes
  • Photoshop: Yes
  • Pixelmator: No

Nature Walk 2015

Today’s Nature Walk Tour was, by far, the most engaging and exciting learning moment in which I’ve participated since coming to Yale a year ago. What an outstanding demonstration of the seamless and responsible integration of technology with learning, and of teaching that is focused on student-driven, transcompetent, and holistic pedagogy.

The tour is part of an ongoing project for Marta Wells's course Evolution, Functional Traits, and the Tree of Life.  The project "engage[s] students and the community while promoting awareness of Yale’s natural resources. We intend to create a public Nature Walk with the data students have contributed. We will welcome creative submissions from the community, such as artwork, poetry, photography, videos, or other types of media."

The tour started with a warm welcome from the professor, Alina, and Matt; we were invited to taste Chirps, a cricket-based crisp (http://www.sixfoods.com/#products). As we moved outside, the sensory stimulation continued: we sang and listened to spoken word poetry, smelled bark, touched acorns, and visually absorbed the details of our surroundings. The technology was present but not overwhelming; in fact, a few of us commented that being filmed in a group for a learning resource felt more comfortable and inviting than being filmed in a formal studio for a flipped lesson. These are the types of video resources I would learn from and want to watch if forced to sit in front of a monitor for hours. The sensory stimulation created strong residual messages as bits of knowledge were absorbed almost psychosomatically.

Alina Nevin’s GIS app worked brilliantly, only emphasizing how well she and Matt have blended the technology to subtly enhance the course.

I can’t say enough positive things about this learning experience and the marvelous things being accomplished in the course; it is a major inspiration, and a world of thanks for being invited to view and participate. What a perfect way to spend Earth Day 2015!  --Dana Milstein


When Graduate Students Become Online Teachers

We created this site with the goal of supporting graduate students with a basic toolkit of knowledge and tools for first-time online teaching. [P.S. We do not cover the institutional support that teachers must have to be effective online teachers.]

This resource is a work in progress. We have opened up this site as a shared learning resource so that we may learn from others how to improve it. We welcome your feedback!

Faculty Bulldog Days Review

It's all over but the reflection for the professors and for the CTL organizing staff, and I have finished sitting in on three classes during Faculty Bulldog Days* for spring 2015. Here are some thoughts about that.

Before I talk about the teaching, I can't thank enough the professors who volunteered to have someone come observe their class. We don't have a strong and pervasive culture of openness at Yale, so I thank the professors for standing up and making their teaching work more visible. In the same breath, I want to thank the students in the classes for having a stranger (two, in one of the classes I attended) in their midst. The largest of the class sessions I attended maxed out at 20 students, making interlopers noticeable. Naturally, the five-student class had discussed opening up beforehand, but even the others accommodated visitors seamlessly.

So what about that teaching? Because the sign-up form didn't have a box on it to check for "Yes, I would like any minor mistake or idiosyncrasy made in my class to be splashed across a low-traffic instructional technology blog", I'll only mention things I noticed and liked. (Try not to chafe too much at the vagaries, because even revealing the discipline of a class would pull back the curtain a little too much.)

  • A particularly nice technique I saw was using an un-articulated motif in the class but then at some point in the session raising the motif to a conscious level. If activating prior knowledge contributes to learning, working with this idea at varying scales of "prior" — even within one class — makes sense.
  • Another teacher, in an effort that seemed effective, very noticeably phased in participation over the course of the class. Students engaged in heavier lifting at the beginning, with the professor only nudging along; as the discussion got denser and more challenge-laden (in a good academic way, I thought), the professor increasingly helped portage.
  • In a final example, and at the risk of being banal, one teacher engaged very personally with the work under discussion. Fortunately, the work was comedic, so laughter demonstrated their** engagement, but that personal commitment can make the difference for some students.

Taking on affective filters is a fine line, of course: are you giving students a glimpse into personal meaning, or risking scaring them off something they don't connect with in the same way? My bias is for not hiding how you feel about what you're teaching, for not pretending that scholars hold absolutely everything at arm's length. By the same token, of course, you have to model critical engagement with the topic and critical engagement with how you feel about it.

I pepper my thoughts with conditionals and hedging because this was drive-by observation. Some classes gave me prep work, some didn't. Even so, all the people involved in these classes had worked with and through scores of ideas, hundreds of pages of reading, and hours of lecture and/or discussion before I got there, context without which I can't form any strong conclusions. This highlights one of the difficulties in mounting this sort of event: while there's no explicit pressure to participate, the implicit social expectations don't go away. If you're an untenured faculty member teaching in front of a high-ranking administrator, who may come from a radically different field's teaching traditions, how do you keep it together? There's enough potential benefit (and actual benefit, for me) in this event that I hope we do it again, but I hope we never stop trying to make sure it's a scaffolding exercise for the participating faculty rather than an unrewarding chore.

* Honestly, I wish we'd called it something like Classroom Open House or Sharing Our Teaching, or similar, as I don't make the same associations with a prospective student event that I do with this. I do hope, though, that prospective faculty hires are indeed able to sit in on a class or three, and not just in their department of recruitment, during their visits here.

** Gender-obscuring pronouns. Live it, love it.

In an item in yesterday's Yale Daily News about Yik Yak, one professor is quoted as seeing potential there:

[Aleh] Tsyvinski said that as a professor, he rarely gets feedback during the term. He added that he wishes there were an anonymous board, similar to Yik Yak, dedicated to continuous feedback.

Alina Nevins wins Spot Award

Our very own Alina Nevins won a CIO Spot Award from ITS. From the website:

A Support Technician wrote on behalf of several members of the ITS Help Desk to express thanks to Academic Technologist Alina Nevins, who wrote high-quality knowledgebase articles supporting Classes*v2.  "The articles are very well written and clearly spell out what we need to know,” he wrote. “I received a number of emails about Classes*v2 this morning. The articles made it very easy for me to provide information to the clients."

From ITG

Alina has done more than write high-quality knowledgebase articles supporting Classes*v2. She has stepped into the very large support shoes of our dear retired colleague, Gloria Hardman (and is doing an extraordinary job of it). She offered hands-on training to Help Desk staff on the use of V2, our first service with tier-one support at the Help Desk. She manages the V2 queue and our great staffer Jennifer Colafrancesco's work on the service, is mastering Drupal for course and educational technology services support, has just earned a new "Senior Academic Technologist" title, and is an all-around great team member.

Thanks, Alina, and congrats from the crew!

ELI 2015 Conference Notes

Highlights from the ELI 2015 conference in Anaheim, CA (besides the 75-degree weather).

BlendKit, from my alma mater, the University of Central Florida. This MOOC/resource helps faculty and institutions create blended learning courses. From the website: "The goal of the BlendKit Course is to provide assistance in designing and developing your blended learning course via a consideration of key issues related to blended learning and practical step-by-step guidance in helping you produce actual materials for your blended course (i.e., from design documents through creating content pages to peer review feedback at your own institution)."

The Symbiotic Research Toolkit, a research toolkit for students from Georgia University. The idea is that students don't always know how to use the internet as a resource for research. It might be a good resource in the CTL.


Not Everyone Gets a Trophy - Mark De Vinck, Dexter F. Baker Professor of Practice in Creativity, Lehigh University

Outcomes: Understand the importance of creativity as it relates to innovation, understand the value of hands-on learning, and learn how to teach failure without failing.

This faculty member runs a maker lab at the university and provides structured lessons to help students overcome failure, teaching them to be persistent and resilient in the face of setbacks. Each student keeps an inventor's notebook that documents their attempts, processes, and ideas around real hands-on work on problems. He has found the most useful boost for innovation is the creation of a safe space for students to explore all ideas and to approach obstacles as opportunities for learning. He claims that all a maker space needs is a couch, a popcorn maker, and coffee. De Vinck talks about using systematic creativity (nicely defined by Mindfolio here) and the six hats of creativity, defined as follows (from Wikipedia):

Six distinct directions are identified and assigned a color. The six directions are:

  • Managing Blue - what is the subject? what are we thinking about? what is the goal?
  • Information White - considering purely what information is available, what are the facts?
  • Emotions Red - intuitive or instinctive gut reactions or statements of emotional feeling (but not any justification)
  • Discernment Black - logic applied to identifying reasons to be cautious and conservative
  • Optimistic response Yellow - logic applied to identifying benefits, seeking harmony
  • Creativity Green - statements of provocation and investigation, seeing where a thought goes

Takeaways: I enjoyed this talk and the professor's enthusiasm. I wonder whether we could bring this type of problem solving into humanities courses. These innovation and maker-space approaches work well in engineering and the other sciences, but maybe also in Public Humanities or Public Health? Perhaps disciplines where groups work together to discover underlying concepts might benefit from a systematic approach to thinking innovatively.

Learning at Scale and the Science of Learning - Justin Reich, Richard L. Menschel HarvardX Research Fellow, Harvard University

Outcomes: Learn how to distinguish between participation data (which is abundant) and learning data (which is scarce), learn about a taxonomy of current research approaches ranging from fishing in the data exhaust to design research in the core, and understand the importance of randomized experiments (A/B testing) to advancing the science of learning.

At the last ELI conference, in New Orleans in 2014, MOOCs had a high profile; they were reaching a peak of interest and a flurry of activity. Now that we have a body of work done, we seem to be entering a time of deep assessment of the outcomes. Reich has written a few white papers about his research on HarvardX (one found here). Just as there is diversity in learning experiences, there is also diversity of goals, and that diversity is central to understanding the enterprise. There is a difference between measures of activity and measures of learning: there's a lot of data about what people click on, but not about what goes on in their heads. The question is: what kinds of things are students doing that help learning outcomes?

Reich believes we should reboot MOOC research, and he offered suggestions for how we might do more research on learning rather than engagement (clicks).

Improving structures for assessment

  1. measure full range of competencies
  2. measure change in competency over time
  3. borrow validated assessments from elsewhere

MOOC research has the following options at this time:

  1. fishing in the exhaust (tracking the clicking data)
  2. experiments in the periphery - domain independent (they don't have anything to do with the discipline being taught, e.g., learning styles or course design options), which means you can plop them into different domains (disciplines)
  3. design research in the core - helps explain how to get students past a barrier or threshold and how to help students learn a course's core concepts better

Takeaways: I thought this talk highlighted just how hard it is to create meaningful assessments of learning in an area with such a diverse set of students. It seems to me that assessment might be based on the goals of the groups who want information about how MOOCs are doing. Faculty would probably be interested in whether students understand the core concepts of a course; administrators might be more interested in enrollment numbers and completion rates (perhaps the amount of clicking); and students are probably interested in ease of use and the quality of the material. That last group is probably the hardest to understand in terms of goals; it's a broad group spanning many lands.


Frontiers of Open Data Science Research - Stephen Howe, Director of Technical Product Management-Analytics, McGraw-Hill Education.

Outcomes: Learn how data science is being applied to gain new insights about learner and instructor behavior.

The next generation of education technology will use open data to provide a foundation for learning analytics and adaptive learning. The new frameworks will give continuous, quality, personalized feedback to help align curricula and assessments and to help students make course corrections in their learning. Using open data, educators can provide measurement and reports that will affect learning outcomes. Three areas can be explored:

  1. Description - baseline requirements
  2. Prediction - attention lost? off course? doesn't understand?
  3. Prescription - how to adjust - adaptive learning

Using adaptive learning environments and providing real-time feedback gives students a pathway from what is known to what should be learned next. But how can we tap the power of adaptive products when it is locked inside software? Howe stresses the need for open architectures and standardized data output.

  • IMS Standards
  • LTI - interoperability for integration and identity
  • QTI - assessment interoperability
  • Caliper - a new standard for common data exchange (JSON)

Howe shared a graphic of three main areas. The first is the data source of learning events, which is converted to a common data language. From that common language, input APIs feed a learning analytics storehouse where the data is sorted. From that storehouse, output APIs publish to products (phones, computers, dashboards, etc.). Howe claims you must start with the product that is trying to answer a question (the goal outcome), then backwards-architect how you sort the data.
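
As a rough sketch of that flow, here is a minimal toy version: a normalizer into a common event shape, an input API into a storehouse, and an output API for a dashboard product. All field names here are hypothetical illustrations, not taken from the Caliper specification or any real product.

```python
from collections import defaultdict

def normalize(raw_event):
    """Convert a source-specific learning event into a common format.

    The source and target field names are invented for illustration.
    """
    return {
        "actor": raw_event["user_id"],
        "action": raw_event["verb"],
        "object": raw_event["resource"],
        "time": raw_event["ts"],
    }

class AnalyticsStore:
    """A toy 'learning analytics storehouse': input API in, output API out."""

    def __init__(self):
        self.events = []

    def ingest(self, raw_event):
        # Input API: everything passes through the common language first.
        self.events.append(normalize(raw_event))

    def actions_per_actor(self):
        # Output API: one question a dashboard product might ask.
        counts = defaultdict(int)
        for e in self.events:
            counts[e["actor"]] += 1
        return dict(counts)

store = AnalyticsStore()
store.ingest({"user_id": "s1", "verb": "viewed", "resource": "video-3", "ts": 1})
store.ingest({"user_id": "s1", "verb": "paused", "resource": "video-3", "ts": 2})
store.ingest({"user_id": "s2", "verb": "viewed", "resource": "quiz-1", "ts": 3})
print(store.actions_per_actor())  # {'s1': 2, 's2': 1}
```

Note how the "backwards architecture" shows up even in the toy: the output query (`actions_per_actor`) dictates which fields the common language must preserve.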

Takeaway: the crux of the biscuit is always the open and standardized data source, hard enough to achieve across a single institution, let alone across many. I don't believe we've done enough here at Yale to leverage product APIs and LTIs in our LMS, but I know it's in our sights and on our roadmaps. The future frontier looks bright.

Overall, I believe the conference themes that resonated throughout were learning analytics, hands-on learning assignments that give students the opportunity to fail and try again, and competency-based learning objectives. And did I mention it was really warm and sunny there?

Timeline + Map Web Tool Comparison

I prepared this brief for a pair-taught course on monasticism, in which the professors wanted to explore using chronological, locative, and narrative data from historical, ethnographic, archaeological, literary, and visual sources to facilitate sophisticated comparative analysis. In particular, they hoped students would make connections and distinctions between phenomena that were non-obviously juxtaposable. They wanted to present this data visually on a website, using both a timeline and a map, with some navigational latitude available to site visitors.

Best Options

I've bolded the most salient items for each option.

Neatline

Pros

  • Allows points, lines, and polygons for representing locative data.
  • Date ambiguity representation
  • Sophisticated object metadata
  • Baselayer choices
  • Active development, at an academic institution
  • Self-hosted

Cons

  • Nontrivial learning curve
  • Sophistication accompanied by sophisticated interface that can be distracting/annoying for students. Requires solid explanation and clearly defined metadata requirements
  • Interface impermanence
  • Standalone, no embedding

Representative Example: Ibn Jubayr

TimeMapper

Pros

  • Google spreadsheet for data store (with implication of using Google Form for student contributions)
  • Data decoupled from presentation
  • Wide range of media embedding
  • Responsive design
  • Embeddable in other sites, such as WordPress
  • Good with BCE dates
  • Simple setup and use

Cons

  • Interface impermanence
  • Limited customizability, though can be deployed to Heroku
  • Uncertain development, sponsored by not-for-profit

Representative Example: Panhellenic Competition at Delphi

Timemap.js

Pros

  • Multiple options for data, including both Google spreadsheet (with implication of using Google Form for student contributions) and local
  • Data potentially decoupled from presentation
  • Self-hosted
  • High level of GUI customizability
  • Baselayer choices (though more limited than Neatline)

Cons

  • Old code
  • Interface impermanence
  • Mobile interface unknown
  • Embeddable as an IFRAME only, usability unclear

Representative Examples: Google spreadsheet with additional arbitrary data points, Themed data

MyHistro

Pros

  • Entries commentable
  • Dedicated iOS application for mobile use
  • Clear data export to CSV, KML, and PDF
  • Embeddable
  • Easy to use points, lines, polygons
  • Variable placemark colors
  • Can 'play' the timeline like a slideshow
  • Semi-automatic semi-multilinguality

Cons

  • Tightly coupled data and presentation
  • Privileges linear reading, though not a requirement
  • Unsophisticated design
  • BCE dates don't seem to get calculated and stored accurately

Representative Example: Early Mesopotamia

Commonalities

In all cases, you'll have to identify distinct start and end dates rather than using century-level notation. The individual dates don't have to be more precise than a year. BCE dates are often entered by prepending a minus sign to the year (e.g., -200 is 200 BCE).
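
If your source material uses century-level notation, a small helper can expand it into the explicit start and end years these tools expect. This is a sketch; the inclusive year ranges and the signed-year convention are assumptions you should check against each tool's documentation.

```python
def century_to_years(century, bce=False):
    """Expand a century number into an inclusive (start, end) year pair.

    Uses a simple signed-year convention: negative numbers are BCE years
    (e.g., -200 means 200 BCE), matching how many timeline tools accept
    BCE dates. Verify each tool's convention before relying on this.
    """
    if bce:
        # The 3rd century BCE runs from 300 BCE down to 201 BCE.
        return (-century * 100, -((century - 1) * 100 + 1))
    # The 19th century CE runs from 1801 to 1900.
    return ((century - 1) * 100 + 1, century * 100)

print(century_to_years(3, bce=True))  # (-300, -201)
print(century_to_years(19))           # (1801, 1900)
```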

Additionally, it always needs to be said that at any moment, development on any of these might cease or changes in browsers and student browser usage might render the code unusable. Even on the options being actively developed, the development team might make a material change in the interface, altering substantially how it looks and works. Other technological or cultural changes can't be ruled out.

Other Options

Most other choices focus on locative storytelling, constraining a visitor to moving along a linear path:

Using Online Photo Albums for a Shot Breakdown Assignment


The online photo album is not a complicated tool, but it can be used to do more than share pictures of junior’s first trip to the zoo.

While prose writing is still a crucial skill for our students to acquire, they often need to analyze visual materials or to present analytical material in some visual form. Online photo albums are a simple way to do this. The photo album doesn’t do the work for you; you still need to know:

  • what you’re saying,
  • what the tool does,
  • what your options are, both tool-wise and more generally.

A familiar assignment in film studies is the sequence analysis or shot breakdown.

  • The student analyzes a short clip.
  • She makes a table representing formal features–e.g.,
    • how many close-ups vs. long shots,
    • how many still vs. moving camera shots,
    • etc.
  • Then the student typically writes a prose essay summarizing the results.
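
The tabulation step in the bullets above can be sketched in a few lines; the shot-by-shot annotations here are invented for illustration, standing in for what a student would transcribe while stepping through the clip.

```python
from collections import Counter

# Hypothetical annotations for a short clip, one entry per shot.
shots = [
    {"scale": "close-up", "camera": "still"},
    {"scale": "long shot", "camera": "moving"},
    {"scale": "close-up", "camera": "still"},
    {"scale": "medium shot", "camera": "still"},
    {"scale": "close-up", "camera": "moving"},
]

# Tally the formal features the assignment asks about.
scale_counts = Counter(s["scale"] for s in shots)
camera_counts = Counter(s["camera"] for s in shots)
print(scale_counts)   # Counter({'close-up': 3, 'long shot': 1, 'medium shot': 1})
print(camera_counts)  # Counter({'still': 3, 'moving': 2})
```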

An online photo album can make a good alternative or additional assignment.

  • A visual presentation can be an intermediary assignment leading towards a prose essay.
  • Or it can represent a visual summary, an expression of some of the same ideas in a different form.

Trailer Shot Breakdown Analysis
A photo album will not capture all the nuance of prose, but reducing one’s arguments to a few points, or focusing on developing only a small compass of ideas, can have a wonderfully clarifying effect. It’s like writing an abstract with pictures.
Ideas can be expressed in a photo set verbally or through the arrangement of images.

  • In the case of a verbal commentary or analysis, that can be put:
    • in the caption of each image;
    • or it can be inserted through explanatory slides created in Powerpoint or Keynote (or similar).
  • In the case of the visual elements, even changing the sequence of the images allows students to explore different ways of creating meaning. The images can be sorted:
    • sequentially, as when representing the shots of a scene in the order in which they appear in the film; or the images may be sorted
    • logically into types or kinds. In this case, the kinds represent the argument.

To show these options, I have grabbed still frames from the first two minutes or so of The Maltese Falcon.

  • This is a familiar movie, and the opening is (or used to be) used to demonstrate editing and its analysis in Bordwell and Thompson’s Film Art textbook.
  • Normally, a student would analyze a scene or other more-organic unit of a feature film. In this case, I’ve studied only a fragment, as that simplifies the demonstration.


Captions May Supply the Verbal Content.

This Google Plus Photo album adds captions to screenshots to call out important dimensions of the clip (like framing and duration), while other important elements of storytelling are previewed and narrated.

  • Clicking on an image brings up the image on one side of the screen and the caption on the other.
  • Adjusting the browser’s degree of zooming allows the caption to take up more space on the screen, thus making a good balance of image and analysis.
  • As an added bonus, a ‘backdoor’ lets you create an embeddable Flash-based slideshow you can put in Classes*v2, WordPress, etc.

    • The annotation obscures the image, but it’s an interesting option.
    • N.B. To do this, the photo album must be set to public.

Slides May Supply the Verbal Content.

The verbal analytical dimension may be added by generating text slides in Keynote or Powerpoint to create a ‘visual essay’ which analyzes the clips using brief text slides interspersed between frames representing the shots.

  • This example analyzes the shots in order.
    • The text slides preview key things to look for in the subsequent few slides.
      • This approach suggests the viewer will go back and forth to connect the argument and the evidence.
  • A different approach is to organize the shots by type or kind in some way.
    • This album counts how many shots of specific types and then labels and sorts them.
    • By labeling, sorting and re-arranging the elements one is analyzing, patterns emerge that may not have been apparent in the un-remixed text.
      • This could be called the basis of much analytical thinking: re-sorting experiences to compare them and find non-apparent patterns and resemblances.
    • What counts as a ‘type’ can be motivated by one’s curiosities or observations.
      • In this case, the symmetries and asymmetries of gender are revealed in neat ways by sorting the shots by various formal and other criteria, i.e., an idiomatic folksonomy rather than a scientific taxonomy.

In this case, the album with the captions was made first and then the photos copied to new albums.

  • Hence these images all bear captions, though in practice that would likely not happen.

An additional challenge is how to represent movement or duration.

  • Long takes or camera movement could be represented by multiple frame grabs.
  • The one-frame-per-shot ratio has the virtue of visualizing the shot count.
  • Whichever choice the author makes, clearly labeling the result helps orient the end-user.

–Edward R. O’Neill, Ph.D.