Course Quality Analysis
To perform a course quality analysis, I created a QM account and conducted a self-review of the course. Most of the standards were met within the presentation of the syllabus itself, such as stated expectations and identification of required technology. I marked Standard 1.6 as 'not met' because the section pertaining to computer skills and digital information literacy skills could be improved; it does not fail outright, but more context could be provided to the student. Standard 1.7 was not met because I have not outlined the prerequisites for the course thoroughly enough beyond the college description, and doing so would be very helpful. An email review is pending in the system.
UX review
Accessibility course review
A review has been conducted on the current status of the course CSUS 320 to determine whether it meets all accessibility requirements; the findings are itemized below. Some of the simpler aspects of accessibility adaptation already meet the requirements, but significant changes are needed, particularly to the supplemental material for the course, such as PDF documents, videos, and images. A rough automated spot-check for the items marked FAIL is sketched after the checklist.
- Text and Contrast - PASS: Black text on a white background is already in use for high contrast.
- Text Styles - PASS: Other text styles, such as bold and italics, are in use to denote emphasis.
- Heading Styles - FAIL: Documents provided online for student review need to be reformatted with a proper heading hierarchy.
- List Styles - PASS: List styles employ bulleted and numbered components for proper organization.
- Alternative Texts - FAIL: Images do not have descriptive alternative text attached.
- Multiple Avenues for Multimedia - FAIL: The multimedia incorporated with the course material has not been captioned or provided with an alternative method of dissemination.
- Added Context - PASS: Descriptive link text has been provided where needed for contextual navigation.
- Tables - N/A: No tables are included in the course material.
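For the failing items above (heading hierarchy, alt text, and multimedia captions), a quick automated pass can flag the most common problems before a full manual review. The sketch below is illustrative only: it assumes the course pages can be exported as local HTML files in a course_pages/ folder (a hypothetical location, not part of the review) and that the BeautifulSoup library is installed.

```python
# Rough accessibility spot-check for course pages exported as local HTML files.
# Assumptions: files live under "course_pages/" (hypothetical) and
# beautifulsoup4 is installed (pip install beautifulsoup4).
from pathlib import Path

from bs4 import BeautifulSoup


def audit_page(path: Path) -> list[str]:
    """Return a list of potential accessibility issues found in one HTML page."""
    issues = []
    soup = BeautifulSoup(path.read_text(encoding="utf-8"), "html.parser")

    # Alternative Texts check: every image should carry descriptive alt text.
    for img in soup.find_all("img"):
        if not (img.get("alt") or "").strip():
            issues.append(f"{path.name}: <img src='{img.get('src')}'> has no alt text")

    # Heading Styles check: heading levels should not skip (e.g. h2 -> h4).
    levels = [int(h.name[1]) for h in soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"])]
    for prev, curr in zip(levels, levels[1:]):
        if curr > prev + 1:
            issues.append(f"{path.name}: heading level jumps from h{prev} to h{curr}")

    # Multiple Avenues for Multimedia check: embedded video should offer captions.
    for video in soup.find_all("video"):
        if not video.find("track", attrs={"kind": "captions"}):
            issues.append(f"{path.name}: <video> has no captions track")

    return issues


if __name__ == "__main__":
    for page in sorted(Path("course_pages").glob("*.html")):
        for issue in audit_page(page):
            print(issue)
```

A check like this only catches structural problems; whether the alt text and captions are actually descriptive still requires the manual review described above.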
Elements of an engaged learning space
As a teacher of online and hybrid courses for many years, I have found a few elements of engaged learning spaces to be beneficial for student success.
- Self-experiential learning: At least twice during each semester, I require my students to engage with their personal world experientially and apply one or more aspects of the course material to the experience. For instance, when we discuss wetlands during the semester, I require them to seek one out, photograph it, and report back about their experience and findings. This grounds the assigned reading in real-world engagement and allows for deeper learning and better understanding.
- Digital media options: Each semester I give my students a choice in how they produce course material. For instance, to disseminate projects, they can choose to create a podcast, video, animation, or similar format. This provides a creative space for subject matter that might be scientific in nature while tapping into their personal learning styles. This type of creation space allows for deeper engagement and ownership of the process and outcomes, producing a deliverable with deeper student investment.
- Live video conferencing: I also require all of my online students to engage in some form of live video conference, both with peers and individually with me. Providing this personal yet virtual engagement space reduces the anonymity of the online course and increases student investment and engagement.
Proposed discussion activity
For this week on wetlands, you have no reading. Instead, you will visit a wetland on campus (unless you want to go to a different one), take at least three photos, and describe what you are seeing based on the overview from our discussion in class today. With your photos, describe the elements of a wetland that are present. Discuss how the wetland could be, or is being, affected by nearby or surrounding development, and describe in detail what this development is and how it impacts wetlands. The lecture will be posted under WEEK 12 - WETLANDS, as will the discussion post where you will upload your photos and comments. Each student will respond to two peer postings and will comment on 1) the comparisons with the wetland that the peer posted, and 2) how accurate and/or inclusive the post is regarding wetland and development characteristics.
I have employed this discussion effectively in my course on a couple of occasions. The activity accomplishes both discussion and experiential learning.
Screencasting
Submitted on the AL883 course site.
Course Introduction Project
In the image below, I have created a very basic screenshot mockup of a landing page. The page has links to each week of instruction, where students can find current, future, and archived assignments. The bottom of the page has an area where students can see each assignment for the current week of instruction, upcoming due dates, and so on. In the middle of the page, one button links to the syllabus and another to the course schedule. Above that is the video box. From course enrollment through the end of the first day of the course, it will hold a video of the instructor welcoming students to the course, with a verbal description of the course and its expectations. After the first day of class, it will become the placeholder for weekly lecture videos.
Research article review
Stephen D. O'Connell & Guido Lang (2018) Can Personalized Nudges Improve Learning in Hybrid Classes? Experimental Evidence From an Introductory Undergraduate Course, Journal of Research on Technology in Education, 50:2, 105-119, DOI: 10.1080/15391523.2017.1408438
The study was conducted to determine how effective personalized email contact might be in promoting student success, where student success was defined as study consistency and learning outcomes. The underlying issue central to the study was that “as education shifts to more flexibly delivered content, students become more personally responsible for the administration of their own learning effort.” This is a factor that many hybrid students struggle with. Accountability comes in a different form, and student participation can often waver. I experience this myself even as a PhD student: life gets busy, time management is different in a flexible setting, and the scheduling of projects, workload, and even peer interaction can suffer.
For this particular study, the researchers grounded their work in a rather thorough body of previous research, particularly regarding student effort provision and time use. They found, however, that the literature was geared primarily toward in-class sessions, and they worked to expand their examination to incorporate literature dealing with student self-reported time use and extracurricular time constraints, among other factors.
The researchers conducted their study over three semesters and 13 sections of the same course, with roughly 345 consenting participants. They created a web-based platform so that they could monitor each student's active online engagement with the course. Emails were sent to students on days when they did not have scheduled class engagement, and the nudges contained in those emails were based on previous studies.
The findings showed that students were strongly affected by the email nudges with regard to their success on exams, but showed only negligible increases in course participation and a marginal increase in weekly study time. The email reminders narrowed the gap in login times, showing that students were more likely to log in sooner when receiving the reminders, and were associated with more weekend logins and increased study time.
Overall, the study showed that a simple, automated, yet personalized email reminder about course engagement is effective at increasing student study engagement and exam success, and it could be a useful tool for hybrid instructors to employ. The researchers acknowledge that the study was not able to determine how actual study time was affected by personal behavior changes, as there were too many variables at play. Regardless, I feel that this is not necessary to conclude that this simple tool is an effective one.
Current trends in educational technology
Response to:
Rayfield, John, et al. Identifying Innovative Agricultural Education Programs. Journal of Career and Technical Education, Vol. 27, No. 2, Winter 2012, pp. 38-50.
Educational technology availability and advancement clearly vary by field, and as such some fields are more limited than others. For instance, agricultural education (Ag Ed) is often grounded in experiential teaching approaches and, due to the nature of the field, can sometimes lag in the incorporation of educational technologies. In 2012, John Rayfield et al. conducted a study, published as a paper, that worked to identify the innovations in Agricultural Education and the industry's goals for the field. The goal was to identify what a prime Ag Ed program would look like by the year 2020, in an effort to represent standards for that academic field. The study addressed the occurrence of current innovative programs and then broke the facets of those programs down into identifiers of what constituted an innovative program, or what was needed. While many of those innovations worked to encompass and advance the experiential aspects of existing programs, some of the identifiers were technology based.
The researchers conducted three rounds of surveys. The first round identified that the use of current technologies was imperative to meeting the goals. Further, “Adequate facilities (new or renovated) to encompass modern agricultural and educational technologies, classroom and laboratories” were a key finding. Unfortunately, these were the only two educational technology components listed out of 51 variables, and there was no identification of what those technologies are or could be in the future. This suggests to me that there are still massive limitations on the incorporation of educational technology advancements in the field of agricultural education. The article does not make clear why that might be, or what efforts are being made to advance this form of learning engagement. Perhaps this is a field that would not be better served by such technological advances, but I find that difficult to accept, as the agricultural industry as a whole has progressed to such an advanced stage. Learning engagement opportunities that incorporate online, hybrid, robotic, or a plethora of other tools could potentially be highly accessible and beneficial.