
List:       lon-capa-cvs
Subject:    [LON-CAPA-cvs] cvs: modules /gerd/correlpaper correlations.tex
From:       www <lon-capa-cvs () mail ! lon-capa ! org>
Date:       2006-09-30 20:35:13
Message-ID: cvswww1159648513 () cvsserver


www		Sat Sep 30 16:35:13 2006 EDT

  Modified files:              
    /modules/gerd/correlpaper	correlations.tex 
  Log:
  Final structure set.
  
  
["www-20060930163513.txt" (text/plain)]

Index: modules/gerd/correlpaper/correlations.tex
diff -u modules/gerd/correlpaper/correlations.tex:1.13 \
                modules/gerd/correlpaper/correlations.tex:1.14
--- modules/gerd/correlpaper/correlations.tex:1.13	Fri Sep 29 15:13:27 2006
+++ modules/gerd/correlpaper/correlations.tex	Sat Sep 30 16:35:10 2006
@@ -74,10 +74,16 @@
 \item We classify the online homework discussion contributions from one course
 \item We deploy the MPEX for comparison as a pre- and post-test 
 \item We are using the pre- and post-FCI, as well as the final exam and course grades, as a measure of student learning
+\item We correlate online discussion behavior and MPEX cluster scores with measures of student learning
 \end{itemize}
 
 \section{\label{background}Background}
-Online discussions are a rich source of feedback to the instructor~\cite{kortemeyer05feedback}, and their quality and character were found to be correlated with the type and difficulty of the associated problems~\cite{kortemeyer05ana}, i.e., data exists regarding the influence of {\it problem} characteristics on associated discussions. Unfortunately, less data exists on the correlation between {\it student} characteristics and discussion behavior, because usually only very few student characteristics are known, with the exception of the students' overall performance in the course. Thus, one of the few findings was the fact that certain discussion behavior, most prominently exhibited on ``non-sanctioned'' discussion sites external to the course, is negatively correlated with performance in the course~\cite{kashy03,kortemeyer05ana}. Also, Hogan~\cite{hogan99} assessed eighth graders' epistemological frameworks through interviews and then analyzed their discussion behavior in a science course with a particular focus on collaboration, finding a number of correlations.
+Previous studies indicate that correlations between epistemological beliefs and academic performance exist, both directly and indirectly~\cite{schommer93,may02}. The problem is how to measure these beliefs; techniques include surveys, guided interviews, and observations.
+Many of these, though, take place in artificial research settings, outside the normal course activity, and over a relatively short time, and research results regarding their predictive power are not conclusive: Coletta and Phillips~\cite{coletta05} found a strong correlation between the FCI Gain and the MPEX Score, while Dancy~\cite{dancy02} found low correlations between the MPEX and the performance on homework, tests, and final exams. The discrepancies might all be traced back to the ``Product Warning Label''~\cite{mpexwarning}, which cautions that the survey is best used to gain insights into the beliefs of the class as a whole, rather than on an individual level.
+
+Online discussions take place within the regular course context and over its complete duration. They are a rich source of feedback to the instructor~\cite{kortemeyer05feedback}, and their quality and character were found to be correlated with the type and difficulty of the associated problems~\cite{kortemeyer05ana}, i.e., data exists regarding the influence of {\it problem} characteristics on associated discussions. Unfortunately, less data exists on the correlation between {\it student} characteristics and discussion behavior, because usually only very few student characteristics are known, with the exception of the students' overall performance in the course. Thus, one of the few findings was the fact that certain discussion behavior, most prominently exhibited on ``non-sanctioned'' discussion sites external to the course, is negatively correlated with performance in the course~\cite{kashy03,kortemeyer05ana}.
+
+Few studies exist on the correlation between beliefs data gathered in research settings and actual discussion behavior in the course. For example, Hogan~\cite{hogan99} assessed eighth graders' epistemological frameworks through interviews and then analyzed their discussion behavior in a science course with a particular focus on collaboration, finding a number of correlations.
 \section{\label{setting}Setting}
 The project was carried out in an introductory calculus-based physics course with initially 214 students. Most of the students in this course plan on pursuing a career in a medical field. The course had three traditional lectures per week. It did not use a textbook; instead, all course materials were available online. Topics were introductory mechanics, as well as sound and thermodynamics. There was twice-weekly online homework: one small set of reading problems due before the topic was dealt with in class (implementing JiTT~\cite{jitt}), and a larger set of traditional end-of-the-chapter style homework at the end of each topic. The online problems in the course were randomized using the LON-CAPA system, i.e., different students would receive different versions of the same problem (different graphs, numbers, images, options, formulas, etc.)~\cite{loncapa,kashyd01}. The students had weekly recitation sessions, and a traditional lab was offered in parallel. The course grade was determined from the students' performance on biweekly quizzes, the final exam, the recitation grades, and the homework performance.
@@ -92,7 +98,7 @@
 
 
 The author analyzed the online student discussions that were associated with the online homework given in his course, using the scheme first suggested in Ref.~\cite{kortemeyer05ana}. The student names were not available during classification in order to avoid bias.
-There were a total of 2405 such online discussion contributions over the course of the semester.
+There were a total of 2405 such online discussion contributions over the course of the semester, where one posting counts as one contribution.
 The following list shows the classifications taken into consideration, as well as illustrative examples that would receive the respective classification.
 \begin{itemize}
@@ -226,29 +232,30 @@
 Favorable: I go over my class notes carefully to prepare for tests in this course.
 \end{quote}
 \end{itemize}
-
+The overall scores of the students on the MPEX clusters were low (Independence 42\%; Coherence 46\%; Concepts 48\%; Reality Link 55\%; Math Link 40\%; Effort 47\%).
 \subsection{\label{performance}Measures of Student Learning}
 As a measure of student conceptual understanding and learning, we deployed the revised Force Concept Inventory (FCI)\cite{fci} at the beginning and the end of the course, again with voluntary participation. As an additional measure of student performance, the performance on the final exam and the course grade for each student were taken into consideration. For the grade we used the raw percentage score, not the number grades, since it provides finer-grained information about the overall student performance in the course.
+\section{\label{perception}Student Perception of the Online Discussions and Survey Instruments}
+An additional survey was deployed online after the end of the course to gauge students' perception of the online discussions, as well as of the MPEX and the FCI.
+77 students participated anonymously in this survey. On a Likert scale, 73\% stated that they took the FCI seriously or very seriously, while 65\% stated the same about the MPEX. The difference between the answer distributions is, however, not statistically significant. A larger difference was found regarding the question of whether the surveys appeared to be relevant: 62\% of the students found the FCI relevant, while 51\% found the MPEX relevant. These distributions have an $\alpha$ of 1.54, which comes close to confirming a difference at the $p<0.1$-level.
-\section{\label{results}Results}
-\subsection{\label{MPEXDiscussion}Correlations between Discussion Behavior and MPEX}
-To directly compare the attitudes and beliefs measures, we calculated correlations between the prominence of discussion behavior classes and the MPEX clusters, and generally found them to be very low. As an example, the correlation between the score on the Concepts Cluster and the prominence of conceptual discussion contributions turned out to be $R=0.14 [-0.08 \to 0.34]; n=84$ when considering all students, and $R=0.15 [-0.13 \to 0.41]; n=51$ when only considering those who made at least five discussion contributions -- the 95\% confidence intervals (given in square brackets) include zero. Thus, we conclude that discussion behavior and the individual MPEX cluster scores are -- if at all -- only weakly correlated.
+The most surprising result was that only 31\% of the students stated that they would be frustrated or very frustrated if they did not do well on the FCI, and only 30\% of the students stated the same for the MPEX. The FCI percentage in particular is smaller than expected, since the FCI is generally believed to be fairly robust in ungraded settings; see for example Henderson~\cite{henderson}, who found only a 0.5-point difference between graded and ungraded administration of the FCI. Also, the FCI is similar to the tests and exams used in the course, and students tend to base their relative value system regarding a subject area on the assessments used~\cite{lin}.
-\subsection{\label{learningcorrel}Correlations between Discussions, MPEX, and Learning}
-Correlations between the MPEX and measures of student learning are generally weak. Considering final exam and course grade, $R=0.36 [0.17 \to 0.52]$ ($n=97$) between the score on the Coherence cluster and the course grade percentage is the highest correlation found. Dancy~\cite{dancy02} found similarly low correlations with the performance on homework, tests, and final exams: direct comparison with the performance on the final exams found $R=0.37$ for the correlation with the total MPEX score ($R=0.27$ here), $R=0.39$ with the Independence Cluster ($R=0.25$ here), $R=0.24$ with the Coherence Cluster ($R=0.36$ here), $R=0.29$ with the Concept Cluster ($R=0.25$ here), $R=-0.02$ with the Reality Link cluster ($R=0.1$ here), $R=0.3$ with the Math Link cluster (no significant correlation found here), and no significant correlation with the Effort Cluster ($R=0.1$ here). As a caveat already pointed out in section~\ref{setting}, however, the course grade is based on a number of factors, some of which are simply a matter of diligence or effort.
 
+On the other hand, student discussions correlate more strongly with performance measures. Students are taking them seriously, likely because they are perceived as helpful and relevant. In the same post-course survey, 89\% of the students found the discussions either helpful or very helpful, and 73\% stated that they used the discussions to learn physics, as opposed to 35\% who said they often or very often just used the discussions to get the correct result as quickly as possible. Discussions appear to be an authentic reflection of what the students perceive as good problem solving strategy: while an expert would characterize most postings as ``bad strategy,''
+only 17\% of the students admitted that they often used bad problem solving strategies against their better knowledge to get the correct result as soon as possible, and 48\% stated that they rarely or never did so (35\% were not sure).
-Figure~\ref{mpexfci} shows how the final MPEX and FCI scores correlated with each other, i.e., $R=0.24 [0.04 \to 0.42]$ ($n=97$).
-Coletta and Phillips~\cite{coletta05} found a strong correlation between the FCI Gain and the MPEX Score ($R=0.52 [0.24 \to 0.72]; n=37$), while the same correlation turned out much lower in this study ($R=0.17 [-0.05 \to 0.37]; n=84$ here). The correlations reported here are in the same range that
-Perkins et al.~\cite{perkins04} found when investigating the influence of beliefs on conceptual learning, using the CLASS~\cite{adams04} and the Force and Motion Conceptual Evaluation (FMCE)~\cite{thornton98} instruments.
-\begin{figure}
-\includegraphics[width=9cm]{fcipostmpexpost}
-\caption{\label{mpexfci}Correlation of the final FCI score with the MPEX score ($R=0.24 [0.04 \to 0.42]$; $n=97$).}
-\end{figure}
 
-Figure~\ref{physicsgrade} shows the correlation between the prominence of \
physics-related discussions and the course grade percentage (for better statistics, \
only students who contributed at least five discussion entries over the course of the \
semester were considered). The correlation is stronger than with the MPEX Score, yet \
smaller than with the FCI.  
+
+\section{\label{results}Correlation Results}
+\subsection{\label{MPEXDiscussion}Correlations between Discussion Behavior and MPEX}
+To directly compare the attitudes and beliefs measures, we calculated correlations between the prominence of discussion behavior classes and the MPEX clusters, and generally found them to be very low. As an example, the correlation between the score on the Concepts Cluster and the prominence of conceptual discussion contributions turned out to be $R=0.14 [-0.08 \to 0.34]; n=84$ when considering all students, and $R=0.15 [-0.13 \to 0.41]; n=51$ when only considering those who made at least five discussion contributions -- the 95\% confidence intervals (given in square brackets) include zero. Thus, we conclude that discussion behavior and the individual MPEX cluster scores are -- if at all -- only weakly correlated.
+
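Here, the ``prominence'' of a discussion class is, as the figure captions below suggest, the percentage of a student's own postings that fall into that class. The text does not state how the bracketed 95\% confidence intervals were obtained; a minimal sketch, assuming the standard Fisher $z$-transformation for a sample correlation $R$ over $n$ students, would be
\begin{equation*}
z=\tanh^{-1}R,\qquad z_{\pm}=z\pm\frac{1.96}{\sqrt{n-3}},\qquad R_{\pm}=\tanh z_{\pm},
\end{equation*}
which does reproduce the quoted $[-0.08 \to 0.34]$ for $R=0.14$, $n=84$.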
+\subsection{\label{learningcorreldis}Correlations between Discussions and Learning}
+
+Figure~\ref{physicsgrade} shows the correlation between the prominence of physics-related discussions and the course grade percentage (for better statistics, only students who contributed at least five discussion entries over the course of the semester were considered).
 \begin{figure}
 \includegraphics[width=9cm]{physicsgrade}
 \caption{\label{physicsgrade}Correlation of percentage physics-related discussions with grade percentage ($R=0.33 [0.15 \to 0.49]$; $n=111$).}
@@ -266,59 +273,55 @@
 \end{figure}
 
 
-\section{Discussion of the Correlation Results}
-Correlations between Grade, Final Exam, FCI, MPEX, and student discussion behavior \
have turned out lower than expected. The strongest correlations exist with the final \
score on the FCI, namely $R=0.56$ with the grade percentage in the course, $R=0.51$ \
with the prominence of physics-related discussions, and $R=-0.58$ with the prominence \
                of solution-oriented discussions.
-
-An unexpected result was the low correlation between the MPEX cluster scores and the student discussion behavior. We can thus not conclude that student discussion behavior is strongly correlated with student attitudes and expectations as measured by the MPEX. Student discussions and the MPEX also correlate differently with measures of learning, i.e., student discussion correlates more strongly to the FCI, and the MPEX more strongly to course grades and the final exam.
-
-Regarding the hypotheses stated in section~\ref{hypo},
-\begin{enumerate}
-\item a correlation between performance on MPEX clusters and discussion behavior \
                exhibited online could not be confirmed
-\item a medium negative correlation between the prominence of solution-oriented and \
a medium positive correlation between physics-related online discussions and the FCI \
score could be confirmed, while correlations with other discussion characteristics \
                could not be confirmed on the 95\% confidence level
-\item a positive correlation between FCI and MPEX scores could be confirmed on the \
                95\% confidence level, but is very weak
-\end{enumerate}
-Medium correlations exist between the performance on the final exam and the course \
grade on the one hand, and the FCI performance on the other, but the same could not \
                be confirmed for the MPEX scores.
-\section{Discussion of Possible Causal Relationships}
-A purely correlational study does not allow any conclusions regarding causal \
relationships. In this section, we are discussing some possible causal relations and \
                additional experiments that were conducted to confirm some of these.
-\subsection{Discrepancy in the Correlational Power of the MPEX and the FCI}
-A surprising result is the relative weakness of many of the expected correlations with the MPEX, particularly compared to and correlated with the FCI, as well as other course-specific performance measures. Previous studies indicate that correlations between epistemological beliefs and academic performance exist, both directly and indirectly \cite{schommer93,may02}.
+\subsection{\label{learningcorrelmpex}Correlations between MPEX and Learning}
-A hypothesis was formed that the students do not take the MPEX very seriously or don't find it relevant, and that they do not care greatly how they are performing on it. An argument for this possible explanation is that the overall scores of the students on the MPEX were low (Independence 42\%; Coherence 46\%; Concepts 48\%; Reality Link 55\%; Math Link 40\%; Effort 47\%).
+Correlations between the MPEX and measures of student learning are generally weak. Considering final exam, FCI, and course grade, $R=0.36 [0.17 \to 0.52]$ ($n=97$) between the score on the Coherence cluster and the course grade percentage is the highest correlation found.
-To give a more definitive answer, an additional survey was deployed online after the end of the course regarding both the MPEX and the FCI.
+Dancy~\cite{dancy02} found similarly low correlations with the performance on homework, tests, and final exams: direct comparison with the performance on the final exams found $R=0.37$ for the correlation with the total MPEX score ($R=0.27$ here), $R=0.39$ with the Independence Cluster ($R=0.25$ here), $R=0.24$ with the Coherence Cluster ($R=0.36$ here), $R=0.29$ with the Concept Cluster ($R=0.25$ here), $R=-0.02$ with the Reality Link cluster ($R=0.1$ here), $R=0.3$ with the Math Link cluster (no significant correlation found here), and no significant correlation with the Effort Cluster ($R=0.1$ here).
-72 students participated anonymously in this survey. On a Likert scale, 74\% stated that they took the FCI seriously or very seriously, while 65\% stated the same about the MPEX. The difference between the answer distributions is, however, not statistically significant. A larger difference was found regarding the question of whether the surveys appeared to be relevant: 61\% of the students found the FCI relevant, while 51\% found the MPEX relevant. The distributions have an $\alpha$ of 1.64, which comes close to confirming a difference at the $p<0.1$-level.
-The most surprising result was that only 32\% of the students stated that they would be frustrated or very frustrated if they did not do well on the FCI, and only 30\% of the students stated the same for the MPEX. The FCI percentage in particular is smaller than expected, since the FCI is generally believed to be fairly robust in ungraded settings; see for example Henderson~\cite{henderson}, who found only a 0.5-point difference between graded and ungraded administration of the FCI.
+Figure~\ref{mpexfci} shows how the final MPEX and FCI scores correlated with each other, i.e., $R=0.24 [0.04 \to 0.42]$ ($n=97$).
+Coletta and Phillips~\cite{coletta05} found a strong correlation between the FCI Gain and the MPEX Score ($R=0.52 [0.24 \to 0.72]; n=37$), while the same correlation turned out much lower in this study ($R=0.17 [-0.05 \to 0.37]; n=84$ here). The correlations reported here are in the same range that
+Perkins et al.~\cite{perkins04} found when investigating the influence of beliefs on conceptual learning, using the CLASS~\cite{adams04} and the Force and Motion Conceptual Evaluation (FMCE)~\cite{thornton98} instruments.
+\begin{figure}
+\includegraphics[width=9cm]{fcipostmpexpost}
+\caption{\label{mpexfci}Correlation of the final FCI score with the MPEX score ($R=0.24 [0.04 \to 0.42]$; $n=97$).}
+\end{figure}
 
-In summary, it can be confirmed that the correlation results with and between the \
MPEX and the FCI might be weak because the students --- in spite of the best efforts \
of the author --- do not really care that much about them, particularly not how well \
they are doing on them. The main difference between the two instruments is that the \
students find the FCI more relevant than the MPEX, likely because the FCI more \
closely matches the other grade-relevant assessments they encounter in the course, \
and students tend to base their relative value system regarding a subject area on the \
assessments used~\cite{lin}.   
-On the other hand, student discussions correlate more strongly with performance \
measures. Students are taking them seriously, likely because they are perceived as \
helpful and relevant. In the same post-course survey, 90\% of the students found the \
discussions either helpful or very helpful, and 73\% stated that they used the \
discussions to learn physics, as opposed to 34\% who said they often or very often \
just used the discussions to get the correct result as quickly as possible. \
Discussions appear to be an authentic reflection of what the students perceive as \
good problem solving strategy:  while an expert would characterize most postings as \
                ``bad strategy,''  
-only 16\% of the students admitted that they often against better knowledge used bad problem solving strategies to get the correct result as soon as possible, and 48\% stated that they rarely or never did so (36\% were not sure).
+\section{Discussion of Possible Causal Relationships}
+The study showed that there are relatively strong correlations between the final FCI score and both solution-oriented discussion behavior (negative) and physics-oriented discussion behavior (positive). It is an interesting question whether the students learned physics better because of their more expert-like approach, or vice versa.
+In an attempt to answer this question, we are considering the FCI gain as a rough measure of how much physics the students {\it learned} (versus, for example, knew already). We also introduced a measure of discussion behavior gain by splitting the semester in half and calculating the difference between the prominence of discussion behaviors in the first and the second half of the semester.
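Neither gain measure is spelled out in the text. A minimal reading, and it is only an assumption here, is a raw FCI gain and, for each posting class, the change in its prominence between the two halves of the semester:
\begin{equation*}
\Delta\mbox{FCI}=\mbox{PostFCI}-\mbox{PreFCI},\qquad
\Delta d_{c}=d_{c}^{\,\mathrm{2nd\ half}}-d_{c}^{\,\mathrm{1st\ half}},
\end{equation*}
where $d_{c}$ is the percentage of a student's postings of class $c$ (solution-oriented or physics-related) in the respective half; a normalized FCI gain, $(\mbox{PostFCI}-\mbox{PreFCI})/(100\%-\mbox{PreFCI})$, would be an equally plausible reading.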
-\subsection{Discussions Behavior versus FCI and Grade Performance}
-The study showed that there is a relatively strong correlation between solution-oriented discussion behavior (negative) and physics-oriented discussion behavior (positive) and the final FCI score. It is an interesting question whether the students learned physics better because of their more expert-like approach, or vice versa. In an attempt to answer this question, we are considering the FCI gain as a rough measure of how much physics the students {\it learned} (versus, for example, knew already). We also introduced a measure of discussion behavior gain by splitting the semester in half and calculating the difference between the prominence of discussion behaviors in the first and the second half of the semester. We then calculated the following two correlations:
+We then calculated the following two correlations:
 \begin{itemize}
 \item FCI gain versus prominence of solution-oriented and physics-related postings
 \item FCI gain versus gain in prominence of solution-oriented and physics-related postings
 \end{itemize}
 
-As it turns out, the first correlations are significant, with $R=-0.44 [-0.65 - -0.18] (n=47)$ for FCI gain versus solution-oriented discussions, and $R=0.4 [0.13 - 0.62] (n=47)$ for FCI gain versus physics-related discussions. Such significant correlations do not occur for FCI gain versus any of the MPEX cluster scores.
+As it turns out, the first correlations are significant, with $R=-0.44 [-0.65 \to -0.18] (n=47)$ for FCI gain versus solution-oriented discussions, and $R=0.4 [0.13 \to 0.62] (n=47)$ for FCI gain versus physics-related discussions. Such significant correlations do not occur for FCI gain versus any of the MPEX cluster scores.
-On the other hand, the correlations with discussion-gain are not significant: $0.24 [-0.05 -- 0.49] (n=47)$ for FCI gain versus gain in solution-oriented discussions, and $-0.12 [-0.39 -- 0.17] (n=47)$ for FCI gain versus gain in physics-related discussions. Note that these correlations have the opposite sign than expected, however, the confidence intervals include zero in both cases. When looking at the absolute values, the average gain in solution-oriented discussions between the two halves of the semester is $2.4\%$, and the gain in physics-oriented discussions $-0.3\%$ --- in other words, the students did not really change their discussion behavior over the course of the semester, and their discussion behavior does not improve co-measured with their increasing understanding of physics.
+On the other hand, the correlations with discussion-gain are not significant: $0.24 [-0.05 \to 0.49] (n=47)$ for FCI gain versus gain in solution-oriented discussions, and $-0.12 [-0.39 \to 0.17] (n=47)$ for FCI gain versus gain in physics-related discussions. Note that these correlations have the opposite sign than expected; however, the confidence intervals include zero in both cases. When looking at the absolute values, the average gain in solution-oriented discussions between the two halves of the semester is $2.4\%$, and the gain in physics-oriented discussions $-0.3\%$ --- in other words, the students did not really change their discussion behavior over the course of the semester, and their discussion behavior does not improve in step with their increasing understanding of physics.
 Thus, the discussion behavior appears to be a property of the students that is almost constant over the course of the semester, in line with Hammer's~\cite{hammer94} observation that epistemological beliefs are unlikely to be changed implicitly by physics instruction.
-A more expert-like approach that is reflected in more desirable discussion behavior \
causes students to have higher learning gains in physics.  
+We also ran a linear regression analysis of the FCI scores versus discussion behavior. In the equations below, ``PostFCI'' is the predicted post (final) FCI score, ``PreFCI'' is the score on the pre FCI, and ``Solution'' and ``Physics'' are the percentage of solution- and physics-oriented discussion over the course of the semester. For the physics-oriented discussion, we found
+\begin{equation*}
+\mbox{PostFCI}=5.486+0.922\cdot\mbox{PreFCI}+0.24\cdot\mbox{Physics}
+\end{equation*}
+with an explained variance of 45.6\% of the PostFCI score. The effect of the pre-test FCI is significant ($p<0.001$); the effect of the physics discussion is not ($p=0.195$).
+
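``Explained variance'' here presumably refers to the $R^2$ of the fit; this is an assumption, since the text does not say. On that reading, the physics-oriented regression just quoted corresponds to an overall multiple correlation of
\begin{equation*}
R_{\mathrm{multiple}}=\sqrt{R^{2}}=\sqrt{0.456}\approx 0.68,
\end{equation*}
and the same arithmetic applies to the solution-oriented fit below.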
+For the solution-oriented discussion, we found
+\begin{equation*}
+\mbox{PostFCI}=7.606+0.857\cdot\mbox{PreFCI}+(-0.042)\cdot\mbox{Solution}
+\end{equation*}
+with an explained variance of 47.9\% of the PostFCI score. Both coefficients are significant; the solution-oriented discussion has $p=0.019$. Thus, controlling for the PreFCI score, each 10 percentage point increase in solution-oriented discussion lowers the predicted PostFCI score by 0.42 points.
 \section{Conclusions}
-In this introductory calculus-based course, correlations between different \
performance and attitude indicators were found to be lower than expected. Student \
discussion behavior generally correlates more strongly with student performance (FCI, \
final exam, grade) than MPEX results. Particularly the prominence of \
solution-oriented and physics-related discussions correlate relatively strongly with \
the FCI. A more expert-like approach to physics, which is reflected in more desirable \
discussion behavior, causes students to have higher learning gains in physics. On the \
downside, a physics course appears to do little in terms of changing students' \
                approaches to physics.
-
-
-The expected correlation between MPEX clusters and the prominence of different classes of student discussion behavior is largely missing. The reason for this lack of correlation could not completely be determined in the framework of this study: it might be that the mechanisms -- even in related areas -- measure different things, or that at least one of them in fact measures very little, or that, as indicated by an additional survey, the students did not bother responding to the MPEX with sufficient diligence.
+Online student discussions have very little correlation with MPEX outcomes, but appear to be a good reflection of students' individual beliefs regarding the nature of problem solving in physics. Students who exhibit more expert-like views and strategies have higher learning success, even when controlling for prior physics knowledge.
 \begin{acknowledgments}
 Supported in part by the National Science Foundation under NSF-ITR 0085921 and \
NSF-CCLI-ASA 0243126. Any opinions, findings, and conclusions or recommendations \
                expressed in this 
-publication are those of the author and do not necessarily reflect the views of the National Science Foundation. The author would like to thank the students in his course for their participation in this study, as well as Deborah Kashy for assistance with the statistical analysis of the data.
+publication are those of the author and do not necessarily reflect the views of the National Science Foundation. The author would like to thank the students in his course for their participation in this study, as well as Deborah Kashy from Michigan State University for assistance with the statistical analysis of the data, and Stephen Pellathy from the University of Pittsburgh for carrying out the interrater reliability study.
 \end{acknowledgments}
 \bibliography{correlations}% Produces the bibliography via BibTeX.
 


_______________________________________________
LON-CAPA-cvs mailing list
LON-CAPA-cvs@mail.lon-capa.org
http://mail.lon-capa.org/mailman/listinfo/lon-capa-cvs
