POLITICAL SCIENCE 625 (27646)
M 7-9:40 | SH 348
R. Hofstetter | NH-119 | Office hours MT 6-7 pm | 594-6804
rhofstet@mail.sdsu.edu
This course introduces students to basic concepts, theories, and methods employed in the study of political behavior. The course focuses on a subset of the political behavior literature concerning mass media and selected policy attitudes. Students will execute a small survey project, prepare a report based on the project, and write a research paper based on a major survey of San Diego adults.
PREREQUISITE: Completion of POLS 515 or consent of the professor.
This course is divided into three parts. During part one, approximately five weeks, the focus is on reading and discussing the literature, including student presentations of article reviews, and on survey design. During part two, approximately five weeks, the focus is on fieldwork, including execution of a survey project, and on analysis. The concern during this period is methodological and logistical, so reading assignments are very light and class activity involves discussing practical problems rather than the formal literature. During part three, also approximately five weeks, the focus is on the analysis of data, paper writing, and the presentation of the data-based paper to the class.
REQUIREMENTS: Students will complete readings, class presentations, and homework at a pace that keeps up with the topics being discussed and the lectures. Requirements include attending all lectures and discussions, participating in all formal and informal class activities, making brief presentations on published articles to the class, completing the survey project as assigned, and writing a term paper based on analysis of survey data and presenting it to the class. No student will pass this class without completing each of these assignments, including the survey project.
Formal presentations of published research studies will summarize the theory or theories being tested, the general empirical approach to testing, and the general findings.
No examinations or quizzes will be given. Grades will be based entirely on the quality of the survey project report, the term paper submitted at the next-to-last class meeting of the semester, the quality of work in surveying, article presentations, and, to a lesser extent, participation during class discussions and lectures. Students who have not completed POLS 515 and/or POLS 516 may acquire knowledge of microcomputer operation and analysis procedures by taking the SSRL SPSSPC+ Workshops. Limited additional consulting, documentation, and machine access are available at the Social Science Research Laboratory and a number of other computer facilities on campus. Students should procure several floppy disks.
MATERIALS:
Required:
Erikson, Robert S., & Tedin, Kent L. American Public Opinion: Its Origins, Content, and Impact, 5th ed. (or most recent edition). Boston: Allyn & Bacon.
GRADING:
Survey Project 35%
Term Paper 35%
Class Presentations 30%
OUTLINE:
Week 1 | Introduction to Political Behavior. | January 23.
| Last day to add or drop February 7.
| i. Review Xeroxed "Empirical Analysis
Paper," pp. 1-3.
| Week 2 | Opinion and Ideology. | January 30.
| Last day to add or drop February 2.
| i. Erikson & Tedin, Chapters 1-5.
| ii. Supplemental Readings:
| Sidney Verba (1996). "The Citizen Respondent: Sample Surveys and American
Democracy." American Political Science Review, Vol. 90, no. 1, pp. 1-7.
| Christopher M. Federico and Jim Sidanius. (2002). "Sophistication and the Antecedents of Whites' Racial Policy Attitudes: Racism, Ideology, and Affirmative Action in America." Public Opinion Quarterly, Vol. 66, No. 2, pp. 145-176.
| Dennis R. Hoover, Michael D. Martinez, Samuel H. Reimer, and Kenneth D. Wald. (2002). "Evangelicalism Meets the Continental Divide: Moral and Economic Conservatism in the United States and Canada." Political Research Quarterly, Vol. 55, No. 2, pp. 351-374.
| Virginia A. Chanley, Thomas J. Rudolph, and Wendy M. Rahn. (2000). "The Origins and Consequences of Public Trust in Government: A Time Series Analysis." Public Opinion Quarterly, Vol. 64, No. 3, pp. 239-256.
| Susan E. Howell and William P. McLean,
"Performance and Race in Evaluating Minority
Mayors," Public Opinion Quarterly, Vol.
65, No. 3, 1999, pp. 321-343.
| Marc J. Hetherington. (1998). "The Political Relevance of Political Trust." American
Political Science Review, Vol. 92, no. 4, pp. 791-808.
| Thomas C. Wilson. (1994). "Trends in Tolerance toward Rightist and Leftist Groups,
1976-1988." Public Opinion Quarterly, Vol. 58, no. 4, pp. 539-556.
| David O. Sears & Nicholas A. Valentino. (1997). "Politics Matters: Political Events as
Catalysts for Preadult Socialization." American Political Science Review, Vol. 91, no. 1,
pp. 45-65.
| Frank P. Zinni, Jr., Franco Mattei, & Laurie A. Rhodebeck. (1997). "The Structure of
Attitudes toward Groups: A Comparison of Experts and Novices." Political Research
Quarterly, Vol. 50, no. 3, pp. 595-626.
| James H. Kuklinski, Paul M. Sniderman, Kathleen Knight, Thomas Piazza, Philip E.
Tetlock, Gordon R. Lawrence, & Barbara Mellers. (1997). "Racial Prejudice and
Attitudes toward Affirmative Action." American Journal of Political Science, Vol. 41, no.
2, pp. 402-419.
| Kevin B. Smith & Kenneth J. Meier. (1995). "Public Choice in Education: Markets and
the Demand for Quality Education." Political Research Quarterly, Vol. 48, no. 3, pp.
461-478.
| Michael L. Roberts, Peggy A. Hite, & Cassie F. Bradley. (1994). "Understanding
Attitudes toward Progressive Taxation." Public Opinion Quarterly, Vol. 58, no. 2, pp.
165-190.
| David O. Sears, Colette Van Laar, Mary Carrillo, & Rick Kosterman. (1997). "Is it
Really Racism? The Origins of White Americans' Opposition to Race-Targeted Policies."
Public Opinion Quarterly, Vol. 61, no. 1, pp. 16-53.
| John R. Hibbing and Elizabeth Theiss-Morse, "Process Preferences and American
Politics: What People Want Government to Be,"
American Political Science Review, Vol. 95, no. 1,
pp. 145-154.
| James L. Gibson and Amanda Gouws. (2000). "Social Identities and Political Intolerance:
Linkages Within the South African Mass Public" American Journal of Political Science,
Vol. 44, no. 2, pp. 278-292.
| Kenneth W. Terhune, "Nationalism among Foreign and American Students: An Exploratory Study,"
Journal of Conflict Resolution, Vol. 8, No. 3, (September, 1964), pp.
| Paul Goren (2002). "Character Weakness, Partisan Bias, and Presidential
Evaluation," American Journal of Political Science, Vol. 46,
no. 3, pp. 627-641.
| Michael Shamir and John Sullivan, "The Political Context of Tolerance:
The United States and Israel," American Political Science Review,
Vol. 77, no. 4, pp.
| M. Peffley, P. Knigge, and J. Hurwitz. (2001). "A Multiple Values Model of Political Tolerance." Political Research Quarterly, Vol. 54, pp. 379-406.
| Jack Citrin and Ernst B. Haas. (1994). "Is American Nationalism Changing? Implications for Foreign Policy." International Studies Quarterly, Vol. 38, pp.
| William D. Berry, Evan J. Ringquist, Richard C. Fording, and Russell L. Hanson. (1998). "Measuring Citizen and Government Ideology in the American States, 1960-93." American Journal of Political Science, Vol. 42, No. 1, pp. 327-348.
| John A. Clark. (1991). "I'd Rather Switch than Fight: Lifelong Democrats and Converts to
Republicanism among Campaign Activists,"
American Journal of Political Science, Vol. 35, pp. 577-597.
| John L. Sullivan and George E. Marcus. (1988). "A Note on Trends in Political 'Tolerance'." Public Opinion Quarterly, Vol. 52, No. 1, pp. 26-32.
| Robert Lerner, Althea K. Nagai, and Stanley Rothman, "Marginality and Liberalism among Jewish
Elites," Public Opinion Quarterly, Vol. 53, No. 3, (Autumn, 1989), pp.
| Week 3 | Opinion and Action. | February 6.
| i. Erikson & Tedin, Chapters 6, 7, 9.
| Last day to add or drop February 7.
| ii. Supplemental Readings:
| Sidney Verba, Kay Lehman Schlozman, Henry Brady, & Norman Nie. (1993). "Citizen
Activity: Who Participates? What do they Say?" American Political Science Review, Vol.
87, no. 2, pp. 303-318.
| Joseph Gershtenson, "Partisanship
and Participation in Political Campaign
Activities, 1952-1996," Political Research
Quarterly, Vol. 55, No. 3, (2002), pp. 687-714.
| Adrian D. Pantoja, Ricardo Ramirez, and Gary
M. Segura, "Citizens by Choice, Voters by
Necessity: Patterns of Political Mobilization
by Naturalized Latinos," Political Research
Quarterly, Vol. 54, No. 4, (2001), pp. 729-750.
| Allan Cigler and Mark R. Joslyn, "The
Extensiveness of Group Membership and
Social Capital: The Impact of Political
Tolerance Attitudes," Political Research
Quarterly, Vol. 55, No. 1, (2002), pp. 7-26.
| Michael A. Jones-Correa and David L. Leal,
"Political Participation: Does Religion
Matter?" Political Research Quarterly,
Vol. 54, No. 4, (2001), pp. 751-770.
| James L. Gibson. (1992). "The Political Consequences of Intolerance: Cultural
Conformity and Political Freedom." American Political Science Review, Vol. 86, no. 2,
pp. 338-356.
| C. Richard Hofstetter and William A. Schultze. (1989). "Some Observations about Participation
and Attitudes among Single Women: Inferences Concerning Political Translation,"
Women and Politics, Vol. 9, No. 1, pp. 83-105.
| Claudine Gay (2002). "Spirals of Trust? The Effects of Descriptive Representation on the
Relationship between Citizens and Their Government,"
American Journal of Political Science, Vol. 46, no. 4, pp. 717-732.
| Martin Gilens. (1996). "'Race Coding' and White Opposition to Welfare." American
Political Science Review, Vol. 90, no. 3, pp. 593-604.
| C. Richard Hofstetter, Thomas G. Sticht, and Carolyn Huie Hofstetter. (1999).
"Knowledge, Literacy, and Power." Communication Research, Vol. 26, No. 1, pp. 58-80.
| Laura Stoker & M. Kent Jennings. (1995). "Life-Cycle Transitions and Political
Participation: The Case of Marriage." American Political Science Review, Vol. 89, no. 2,
pp. 421-436.
| Allyson L. Holbrook, Jon A. Krosnick,
Penny S. Visser, Wendi L. Gardner, and John T. Cacioppo (2001).
"Attitudes toward Presidential Candidates and Political Parties: Initial
Optimism, Inertial First Impressions, and a Focus on Flaws,"
American Journal of Political Science, Vol. 45, no. 4, pp. 930-950.
| Richard J. Ellis & Fred Thompson. (1997). "Culture and the Environment in the Pacific
Northwest." American Political Science Review, Vol. 91, no. 4, pp. 885-897.
| Kira Sanbonmatsu (2002). "Gender Stereotypes and Vote Choice,"
American Journal of Political Science,
Vol. 46, no. 1, pp. 20-34.
| Barbara Norrander. (1999). "The Evolution of the Gender Gap,"
Public Opinion Quarterly, Vol. 63, No. 4, pp. 566-576.
| Carol Kennedy Chaney, R. Michael Alvarez, & Jonathan Nagler. (1998). "Explaining the
Gender Gap in U.S. Presidential Elections: 1980-1992." Political Research Quarterly,
Vol. 51, no. 2, pp. 311-340.
| Jeffrey Koch. (1998). "The Perot Candidacy and Attitudes toward Government and
Politics." Political Research Quarterly, Vol. 51, no. 1, pp. 141-153.
| David G. Lawrence. (1994). "Ideological Extremity, Issue Distance, and Voter Defection."
Political Research Quarterly, Vol. 47, no. 2, pp. 397-422.
| James A. McCann. (1997). "Electoral Choices and Core Value Change: The 1992 Presidential
Campaign." American Journal of Political Science>, Vol. 41, no. 2, pp. 564-583.
| Michael L. Gross. (1995). "Moral Judgement, Organizational Incentives and Collective
Action: Participation in Abortion Politics."
Political Research Quarterly, Vol. 48, no. 3, pp. 507-534.
| Eileen McDonagh (2002). "Political Citizenship and Democratization: The Gender Paradox,"
American Political Science Review, Vol. 96, no. 3, pp. 535-552.
| Alan D. Monroe. (1998). "Public Opinion and Public Policy, 1980-1993."
Public Opinion Quarterly, Vol. 62, no. 1, pp. 6-28.
| Geoffrey C. Layman and Thomas M. Carsey. (1998). "Why Do Party Activists Convert? An
Analysis of Individual-Level Change on the Abortion Laws."
Political Research Quarterly, Vol. 51, No. 3, pp. 723-750.
| Daron Shaw, Rodolfo O. de la Garza, and Jongho Lee. (2000). "Examining Latino
Turnout in 1996: A Three-State, Validated Survey Approach."
American Journal of Political Science, Vol. 44, No. 2, pp. 338-346.
| C. Richard Hofstetter. (1998). "Political Talk Radio, Situational Involvement,
and Political Mobilization." Social Science Quarterly, Vol. 79, No. 2, pp. 273-286.
| Andrea Louise Campbell (2002). "Self-Interest, Social Security, and the Distinctive
Participation Patterns of Senior Citizens,"
American Political Science Review, Vol. 96, no. 3, pp. 565-574.
| Alan S. Gerber and Donald P. Green. (2000). "The Effects of Canvassing, Telephone
Calls, and Direct Mail on Voter Turnout: A Field Experiment,"
American Political Science Review, Vol. 94, no. 3, pp. 653-663.
| Benjamin Highton and Raymond E. Wolfinger. (2001). "The First Seven Years of the
Political Life Cycle," American Journal of Political Science, Vol. 45, No. 1, pp. 202-209.
| James L. Gibson. (2001). "Social Networks, Civil Society, and the Prospects for
Consolidating Russia's Democratic Transition," American Journal of Political Science,
Vol. 45, No. 1, pp. 51-68.
| Ted Brader and Joshua A. Tucker. (2001). "The Emergence of Mass Partisanship in Russia,
1993-1996," American Journal of Political Science, Vol. 45, No. 1, pp. 51-68.
| Diana C. Mutz (2002). "The Consequences of Cross-Cutting Networks for Political Participation,"
American Journal of Political Science, Vol. 46, no. 4, pp. 838-855.
| Jan E. Leighley. (1995). "Attitudes, Opportunities and Incentives: A Field Essay on
Political Participation." Political Research Quarterly, Vol. 48, no. 1, pp. 181-210.
| Week 5 | Opinion and Communication Media. | February 9
| i. Erikson & Tedin, Chapters 8, 11, 12.
| ii. Supplemental Readings:
| Larry M. Bartels. "Messages Received: The Political Impact of Media." American
Political Science Review, Vol. 87, no. 2, pp. 267-285.
| Anibal Perez-Linan, "Television News and Political Partisanship in
Latin America," Political Research Quarterly, Vol. 55,
No. 3, (2002), pp.571-588.
| Lonna Rae Atkeson and Randall W. Partin, "Candidate
Advertisements, Media Coverage, and Citizen Attitudes:
The Agendas and Roles of Senators and Governors in
a Federal System," Political Research Quarterly,
Vol. 54, No. 4, (2001), pp.795-813.
| Stephen Ansolabehere, Shanto Iyengar, Adam Simon, & Nicholas Valentino. (1994).
"Does Attach Advertising Demobilize the Electorate?" American Political Science Review,
Vol. 88, no. 4, pp. 829-838.
| Thomas E. Nelson, Rosalee A. Clawson, & Zoe M. Oxley. (1997). "Media Framing of a
Civil Liberties Conflict and Its Effect on Tolerance." American Political Science Review,
Vol. 91, no. 3, pp. 553-566.
| Richard R. Lau and Gerald M. Pomper (2002). "Effectiveness of Negative Campaigning in U.S.
Senate Elections," American Journal of Political Science, Vol. 46, no.1, pp. 47-66.
| Bruce Bimber. (2001). "Information and Political Engagement in America: The
Search for
Effects of Information Technology at the Individual Level." Political Research
Quarterly,
Vol. 54, no. 1, pp. 53-67.
| Benjamin I. Page & Jason Tannenbaum. (1996). "Populistic Deliberation and Talk
Radio." Journal of Communication, Vol. 46, no. 2, pp. 33-54.
| C. Richard Hofstetter & Christopher L. Gianos. (1997). "Political Talk Radio: Actions
Speak Louder than Words." Journal of Broadcasting and Electronic Media, Vol. 41, No.
4, pp. 501-515.
| Lee Sigelman, Susan Welch, Timothy Bledsoe, & Michael Combs. (1997). "Police
Brutality and Public Perceptions of Racial Discrimination: A Tale of Two Beatings."
Political Research Quarterly, Vol. 50, no. 4, pp. 777-791.
| Patrick J. Kenney & Tom W. Rice. (1994). "The Psychology of Political Momentum."
Political Research Quarterly, Vol.47, no.4, pp. 923-938.
| C. Richard Hofstetter, Mark C. Donovan, Melville R. Klauber, Alexandra Cole, Carolyn
J. Huie, & Toshiyuki Yuasa. (1994). "Political Talk Radio: A Stereotype Reconsidered."
Political Research Quarterly, Vol. 47, no. 2, pp. 467-480.
| Craig Leonard Brians & Martin P. Wattenberg. (1996). "Campaign Issue Knowledge and
Salience: Comparing Reception from TV Commercials, TV News, and Newspapers."
American Journal of Political Science, Vol. 40, no. 1, pp. 172-193
| C. Richard Hofstetter and David Barker, with James T. Smith, Gina M. Zari, and Thomas
A. Ingrassia. (1999). "Information, Misinformation, and Political Talk Radio." Political
Research Quarterly, Vol. 52, No. 2, pp. 353-369.
| John R. Hibbing & Elizabeth Theiss-Morse. (1998). "The Media's Role in Public
Negativity toward Congress: Distinguishing Emotional Reactions and Cognitive
Evaluations." American Journal of Political Science, Vol. 42, no. 2, pp. 475-498.
| Kim Fridkin Kahn and Patrick J. Kenney. (2002). "The Slant of the News: How Editorial
Endorsements Influence Campaign Coverage and Citizens' Views of Candidates,"
American Political Science Review, Vol. 96, no. 2, pp. 381-394.
| W. Russell Neuman. (1990). "The Threshold of Public Attention." Public Opinion
Quarterly, Vol. 54, no. 2, pp. 159-176.
| Stephen Ansolabehere & Shanto Iyengar. (1994). "Riding the Wave and Claiming
Ownership over Issues: The Joint Effects of Advertising and News Coverage in
Campaigns." Public Opinion Quarterly, Vol. 58, no. 3, pp. 335-357.
| Tara M. Emmers-Sommer and Mike Allen. (1999). "Surveying the Effect of Media
Effects: A Meta-Analytic Summary of the Media Effects Research in Human
Communication Research." Human Communication Research, Vol. 25, No. 4, pp. 478-
497.
| Juliette H. Walma Van Der Molen and Tom H. A. Van Der Voort. (2000). "The Impact
of Television, Print, and Audio on Children's Recall of the News." Human
Communication Research, Vol. 26, No. 1, pp. 3-26.
| William J. Schenck-Hamlin, David E. Procter, and Deborah J. Rumsey. (2000). "The
Influence of Negative Advertising Frames on Political Cynicism and Political
Accountability," Human Communication Research, Vol. 26, No. 1, pp. 53-74.
| Paul M. Kellstedt. (2000). "Media Framing and the Dynamics of Racial Policy Preferences."
American Journal of Political Science, Vol. 44, pp. 245-260.
| Joanne M. Miller and Jon A. Krosnick. (2000). "News
Media Impact on the Ingredients of Presidential Evaluations:
Politically Knowledgeable Citizens are Guided by a Trusted Source."
American Journal of Political Science, Vol. 44, pp. 301-315.
| Diana C. Mutz and Paul S. Martin. (2001). "Facilitating Communication
across Lines of
Political Difference: The Role of Mass Media," American Political
Science Review, Vol. 95,
no. 1, pp. 97-114.
| Robert Huckfeldt, John Sprague, and Jeffrey Levine. (2000). "The
Dynamics of Collective
Deliberation in the 1996 Election: Campaign Effects
on Accessibility, Certainty, and
Accuracy," American Political Science
Review, Vol. 94, no. 3, pp. 641-651.
| David C. Barker. (1998). "Rush to Action: Political Talk Radio and Health Care (Un)Reform." Political Communication, Vol. 15, no. 1, pp. 83-98.
| David C. Barker. (1999). "Rushed Decisions: Political Talk Radio and Vote Choice, 1994-1996." Journal of Politics, Vol. 61, no. 2, pp. 527-539.
| Week 6 | Opinion and Information Processing. | February 16
| i. Supplemental Readings:
| James A. Stimson, Michael B. MacKuen, & Robert S. Erikson. (1995). "Dynamic
Representation." American Political Science Review, Vol. 89, pp. 543-565.
| Bruce Bimber, "Information and Political Engagement
in America: The Search for Effects of Information
Technology at the Individual Level,"
Political Research Quarterly, Vol. 54,
No. 1, (2001), pp.53-67.
| David E. Campbell, "The Young and the Realigning: A Test of the Socialization
Theory of Realignment," Public Opinion Quarterly, Vol. 66, No. 2, 2002, pp. 209-234.
| Nicholas A. Valentino, "Crime News and the Priming of Racial Attitudes
During the Evaluations of the President," Public Opinion Quarterly, Vol. 63,
No. 3, 1999, pp. 293-320.
| Jeffrey W. Koch, "When Parties and Candidates Collide: Citizen Perception of House
Candidates' Positions on Abortion," Public Opinion Quarterly, Vol. 65, No. 1, 1999, pp. 1-21.
| Richard R. Lau & David P. Redlawsk. (1997). "Voting Correctly." American Political
Science Review, Vol. 91, no. 3, pp. 585-598.
| Milton Lodge & Marco R. Steenbergen, with Shawn Brau. (1995). "The Responsive
Voter: Campaign Information and the Dynamics of Candidate Evaluation." American
Political Science Review, Vol. 89, no. 2, pp.309-326.
| Jacob Shamir & Michal Shamir. (1997). "Pluralistic Ignorance across Issues and over
Time." Public Opinion Quarterly, Vol. 61, no. 2, pp. 227-260.
| M. Kent Jennings. (1996). "Political Knowledge over Time and across Generations."
Public Opinion Quarterly, Vol. 60, no. 2, pp. 228-252.
| C. Richard Hofstetter and David Barker, with James T. Smith, Gina M. Zari, and Thomas
A. Ingrassia. (1999). "Information, Misinformation, and Political Talk Radio." Political
Research Quarterly, Vol. 52, No. 2, pp. 353-369.
| James L. Gibson (2001). "Social Networks, Civil Society, and
the Prospects for Consolidating Russia's Democratic Transition."
American Journal of Political Science, Vol. 45, No. 1, pp. 51-68.
| Week 7 | Design. | February 23
| i. Erikson & Tedin, Chapter 2.
| Week 8 | Survey Project. Introduction to Interviewing. | March 1
| i. Reading # 5. (Xerox) "Interviewing Instructions."
| Week 9 | Survey Project Discussion | March 8
| Week 10 | Holiday--Spring Vacation | March 15
| Week 11 | Data Reduction and Processing. | March 22
| SSRL Workshop for SPSS Windows.
| Week 12 | Data Reduction and Analysis. SSRL Workshop
for SPSS Windows | March 29
| Week 13 | Analysis for Paper Discussion. | April 5
| Brief Analysis Report Due
| Week 14 | Analysis for Paper Discussion. | April 12
| Week 15 | Paper Presentations. Submit finished research
papers for a grade. | April 19
| Week 16 | Paper Presentations. Submit finished research
papers for a grade. | April 26
| Week 17 | Paper Presentations. | May 3
Analysis Assignments:
Two types of papers will be written for this course: 1) a brief report analyzing your survey project, and 2) an analysis of survey data collected last year in my classes from a random-digit-dial sample of a cross-section of the adult San Diego public.
Brief Survey Report:
Students will design a questionnaire and interview schedule and complete interviews with 20 different persons of their own choosing (a "convenience" sample). The questionnaire should be designed to answer a specific research question that interests the student and that can be related explicitly to readings in this course. After administration, students will enter the questionnaires into an SPSS data file and store the file on a floppy disk. Data will be analyzed using SPSS or a similar statistical package, and the results written up in a brief report not to exceed six double-spaced pages. The brief report should summarize data from your survey project. It should include an explication of your main hypothesis, a description of the subjects interviewed, a description of the measures employed, and a presentation of data from a computer analysis of your data. The paper will be typed in double-spaced format and will include a very brief introduction, a method section, a presentation of data and explication, and a discussion section. The completed questionnaires should be appended to the paper.
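For students who want to see the same workflow outside of SPSS, here is a minimal sketch in Python (pandas) of entering coded responses, saving them, and running the simple frequencies and cross-tabulation a brief report presents; the file name, variable names, and coding scheme are hypothetical, and the course itself uses SPSS as described above.

# Minimal sketch (not the course's SPSS workflow): enter coded questionnaire
# responses, save them, and run simple frequencies before writing the report.
# File name, variable names, and codes are hypothetical examples.
import pandas as pd

# Each row is one completed interview; values are the numeric codes
# circled on the protocol (e.g., 1 = agree strongly ... 4 = disagree strongly).
responses = pd.DataFrame(
    [
        {"resp_id": 1, "talk_radio_use": 3, "policy_attitude": 2, "age": 34},
        {"resp_id": 2, "talk_radio_use": 1, "policy_attitude": 4, "age": 51},
        # ... one row per respondent, 20 in all for the convenience sample
    ]
)

responses.to_csv("survey_project.csv", index=False)  # stands in for the SPSS data file

# Frequency distribution for one item, the kind of table the brief report presents.
print(responses["policy_attitude"].value_counts().sort_index())

# A simple cross-tabulation of the hypothesized relationship.
print(pd.crosstab(responses["talk_radio_use"], responses["policy_attitude"]))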
Data Analysis Papers:
The more extensive analysis constitutes the term paper for this class. It will involve a computer analysis of the survey data collected by students in San Diego during spring, 2000. The analysis will be based on an appropriately formulated hypothesis drawn from assigned readings and other relevant readings in professional social science journals in the library. Data will be presented from an original analysis in an appropriate format and interpreted in relation to the hypothesis. The paper will be typed in double-spaced format and will include an introduction, a method section, a presentation of data and explication, and a discussion section. Each manuscript will be no more than 15 pages in length. This paper should include:
a. An introduction that presents a brief statement of the problem. This statement will
link the hypothesis to research that
other people have published, and will establish a clear trail of logic leading to the hypothesis
to be tested.
b. A method section that describes the source of the data (e.g., a random-digit-dial
sample survey of adults taken in San
Diego during spring, 2000, with a response rate of approximately 50 percent), and the nature
of measures used (e.g., exact wording
of questions and response distributions for each measure).
c. A findings section that includes a restatement of the hypothesis, presentation of
the data, and explication of the data in
terms of the hypothesis. An additional variable should be introduced as a control variable,
the analysis replicated, and results re-interpreted.
d. A discussion that briefly summarizes what has been done, the logic of the study,
the results, and a statement of where further research will be fruitful.
e. A reference section that includes all cited literature in a professionally appropriate format.
A passing grade in this class is not possible without a term paper that follows these requirements. Papers will be graded for content as well as for proper grammar and expression and for the use of an appropriate style (including citation of sources). Late papers will not be accepted.
Review of Published Articles:
Students will complete two professional article reviews during the initial five weeks of the class. Each review will summarize an article published in a professional journal, chosen from the list of articles in the supplemental readings portion of the syllabus. Students will present the reviews orally to the class and distribute copies of a paper summarizing the review. Papers should not be read aloud, but presentations may utilize brief notes. Presentations will be limited to no more than 12 minutes, and written papers will be limited to four typed, double-spaced pages. Reviews will include:
a. Brief summary of the theory structuring the research. What is the
paper about? What is believed to cause what? What
are the major concepts? How are the concepts defined? What are the main
hypotheses tested in the research?
b. Summary of the data and measurements on which analyses are drawn.
What is the nature of the data? How and when
were the data collected? What are the operational definitions of key concepts/variables?
c. Describe briefly the nature of the data analysis. In general, how was analysis conducted?
d. What are the major conclusions of the research? What conclusions
do you draw from the analysis about the theory on
which the research was based? What criticisms of the research do you have?
Interviewing:
Interviewing is a professional activity that carries heavy responsibilities. It is most important to remember that the survey in which you are participating is an extremely serious matter, used to take systematic observations from which to build knowledge that will help people live healthier lives and to inform government agencies about the health issues being studied in the survey. The overall results will depend on the quality of your work.
Several types of interview are commonly used, including:
1) Unstructured interviews: Interviews in which the person asking questions has a general idea of topics to be covered by
questions but adapts the specific questions asked depending on what the respondent says and is free to word questions in
ways that the interviewer believes will best indicate the information that is wanted. Examples: Journalistic interviews,
some kinds of psychiatric interviews, many exploratory interviews.
2) Structured interviews: Interviews in which the wording and order of questions to be asked is completely determined.
Interviewers have little or no latitude to alter wording or question order. Examples: Public opinion polls, many scientific
surveys, standardized testing.
3) Partially structured interviews: Combination of unstructured and structured interviews.
The Introduction.
The interview that most polling firms conduct is completely structured: you are to read all questions as written and in the order written. Such interviews are designed to insure comparability among the responses given to different interviewers by different respondents. Words that appear in normal print are to be read as written. Words that appear entirely in CAPITAL LETTERS or BOLDED LETTERS are instructions to interviewers and should not be read to respondents.
If a respondent cannot understand a word or phrase, then repeat the sentence. If the sentence is still not understood, an interviewer may modify the wording to aid the respondent in answering, but must note the changed wording in the margin of the survey adjacent to the item changed.
The first portion of the interview, the introduction, is designed to initiate an interview with the appropriate respondent in a
household contacted through the sampling procedure being used. The introduction is to be delivered as worded and functions to:
1) Introduce the interviewer,
2) Introduce the institution,
3) Very briefly introduce the survey purpose,
4) Explain that the survey is confidential and that responses will be used only in statistical analyses of groups of respondents,
5) Select the appropriate respondent to be interviewed in the household, and
6) Gain passive informed consent to conduct the interview.
According to the sampling procedure used, interviewers are guided to the correct respondent, for instance, the adult (person 18
years or older) in the household with the most recent birthday. The interviewer then either proceeds as rapidly as possible to do the
interview once the targeted respondent is contacted or finds out when the targeted respondent will be available for an interview. In
the latter case, the interviewer will arrange a specific date and time to contact the targeted respondent.
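As a rough illustration of the "most recent birthday" selection rule, the sketch below formalizes it in Python; in practice the interviewer simply asks who in the household had the most recent birthday, and the names, ages, and dates here are hypothetical.

# Minimal sketch of the "most recent birthday" respondent-selection rule described
# above. Household members, ages, and birthdays are hypothetical.
from datetime import date

def most_recent_birthday_adult(members, today=date(2004, 3, 11)):
    """Pick the adult (18 or older) whose last birthday is closest to today."""
    def days_since_birthday(birthdate):
        this_year = birthdate.replace(year=today.year)
        last = this_year if this_year <= today else birthdate.replace(year=today.year - 1)
        return (today - last).days

    adults = [m for m in members if m["age"] >= 18]
    return min(adults, key=lambda m: days_since_birthday(m["birthdate"]))

household = [
    {"name": "A", "age": 44, "birthdate": date(1959, 12, 2)},
    {"name": "B", "age": 17, "birthdate": date(1986, 6, 30)},   # under 18, not eligible
    {"name": "C", "age": 21, "birthdate": date(1983, 2, 20)},
]
print(most_recent_birthday_adult(household)["name"])  # "C": the Feb 20 birthday is most recent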
Representative Objections, Comments, and Suggested Responses.
Many questions arise concerning the survey when household contacts are initiated. Representative questions and possible
answers include:
"Where did you get this number?" Your number was generated by computer using a scientifically
developed procedure to represent all Californians with residential telephones.
"Why are you doing this survey?" We are trying to find out what people in this area really think about a number of
important political issues so that we can provide information that may improve our ability to teach students what is going
on in the state.
"Who do you represent?" I am an interviewer in a class at San Diego State University. We are trying to find out about
citizens' political and social opinions and you can help me a great deal.
"What are you going to do with what I say?" Your response will be changed to numbers and put into a computer along
with all other persons so that no one will be able to match your answers with you personally. We will then do statistical
analyses to see what general trends exist in the state on our questions.
"I don't know anything about issues (or anything else)!" That is all right. Everyone's views are important to us even if they
have no opinion on some issues, since it is important for us to know how many people do not have opinions on some
issues.
"Who is going to see what I say?" No one except staff persons on the project. Your identity will be kept in the strictest
confidence so that no one will ever match what you tell us with you personally.
"Why do you need my views?" OR "I just do not want to participate!" It is very important for the scientific validity of the
survey that we get responses from all subjects. Our sample is accurate only if all people surveyed agree to be interviewed
for the short time it will take. Other people have been very helpful, and I certainly need your help for the success of our
class project.
"Who is paying you to do this?" No one, I am a student doing a class assignment at San Diego State University.
"What are you selling?" Nothing! This is a SDSU research project, and your help will make this a more successful project
and help us to provide more accurate information about Californians' political views.
"I cannot continue!" "I have to go now!" The survey just will take a few more minutes of your time. It will be most
helpful to give me a few more minutes of your time. We have finished most of the interview; it will take just a few more
minutes. I'd really appreciate your help for just a few more minutes. IF ALL ELSE FAILS, My instructions permit me to
finish this at another time. What would be the best time for you? When may I finish this up? (Make a specific
appointment to call back and to complete the interview.)
In general, the more interviewers emphasize positive symbols, the higher the rate of cooperation is likely to be. Positive symbols include: "science," "San Diego State University," "help to improve what we teach students about people in California," "help the state of California do a better job in making laws to protect our nation," "do a favor for me," and "help me personally to do well in my job."
The Body of the Survey.
The first few questions are extremely important since most respondents will agree to complete interviews once they have been asked several questions. The first few questions reinforce the legitimacy of the survey and raise the interest of respondents. Thus, once interviewers reach the correct respondent, they should move directly and quickly to ask the rest of the questions.
In telephone interviewing, it is also important to move deliberately and quickly through all questions (without over-pressuring
the respondents). Do not allow respondents to wander or to become overly verbose. Focus questions, if necessary with probes
(neutral questions to focus attention on what you want to find out). If necessary, wait until a respondent takes a breath and then
quickly re-ask a question or move on to the next question.
Be sure to follow the proper sequencing of questions in the survey (the order in which items appear on the protocol). Surveys
will sometimes use filter questions to guide the sequence in which questions are to be asked. For instance, if a respondent says that
he/she has never smoked cigarettes the interviewer will be instructed not to ask whether that respondent has quit smoking or whether
that respondent feels that he/she is able to quit smoking. The latter questions are not relevant to persons who have never smoked and
may lead to irritation and loss of rapport during the rest of the interview. In contrast, all respondents will be asked how many of their
friends smoke cigarettes since both smokers and non-smokers may have friends who smoke.
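To make the filter logic concrete, here is a brief sketch of the smoking example as it might be encoded in a programmed (CATI-style) questionnaire; the question wording, variable names, and canned answers are illustrative, not items from the class instrument.

# Minimal sketch of filter-question (skip) logic using the smoking example above.
# Variable names and the canned answers are illustrative only.
def administer(respondent_answers):
    """Walk one respondent through the smoking block, honoring the filter."""
    recorded = {}
    recorded["ever_smoked"] = respondent_answers["ever_smoked"]

    if recorded["ever_smoked"] == "yes":
        # Asked only of people who have ever smoked.
        recorded["quit_smoking"] = respondent_answers["quit_smoking"]
        if recorded["quit_smoking"] == "no":
            recorded["able_to_quit"] = respondent_answers["able_to_quit"]

    # Asked of everyone, smokers and non-smokers alike.
    recorded["friends_smoke"] = respondent_answers["friends_smoke"]
    return recorded

never_smoker = {"ever_smoked": "no", "friends_smoke": "some"}
current_smoker = {"ever_smoked": "yes", "quit_smoking": "no",
                  "able_to_quit": "yes", "friends_smoke": "most"}

print(administer(never_smoker))    # quit and able-to-quit items are skipped
print(administer(current_smoker))  # all items asked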
Occasionally a respondent may not want to respond to an item. Interviewers must judge whether to repeat the question or not in
order to finish the protocol. It is more important to obtain responses to all the questions in the remainder of the survey than to
lose an interview. If a respondent appears to be on the verge of terminating over an item, move directly on to the next question in the
protocol and complete the survey. When all of the items have been asked at the end of the interview, just say something like the
following: "Now I need to check to see whether I have correctly asked you all of the questions on the survey. Oh, yes! Here is an
item." Then repeat the question as if no problem had existed, thank the respondent, and terminate the interview normally. In many
cases, the respondent will provide an answer to the omitted question.
There are two kinds of questions in most surveys:
1) Closed-ended questions: Questions that include the proper responses for the question. Example: "Would you say you
agree strongly, agree, disagree, or disagree strongly that smoking cigarettes is bad for your health?"
2) Open-ended questions: Questions that do not enumerate the proper responses for the question. Example: "What would
you say are the most important health problems facing most people of Korean descent in California today?"
Closed-ended questions are items with all responses read to the respondents or for which responses are provided but not read.
Simply ask the item, and mark the appropriate response by circling it. If respondents do not give the correct type of answer, then
repeat the question along with response categories, pause, and wait for the response. If the respondent cannot give the correct type
of response, note what was said in the margin of the protocol and go on to the next item without circling one of the categories. It is
the interviewer's responsibility to insure that coders understand unambiguously what the specific response is to all questions!
Open-ended questions are free response items, questions that are asked and respondents give spontaneous answers. They are
typically questions followed by a blank for writing the response. Note that the response is to be written, with the meaning preserved
to your best ability. Usually you should write exactly what the respondent says. Responses, however, may occasionally require
paraphrasing what the respondent actually said when responses are long or complex. It is the interviewer's responsibility to annotate
the questions in margins so that complete and accurate meaning of responses is preserved. Annotation should be checked and
clarified as soon as interviewing is complete.
Probes are quite commonly employed in order to clarify the meaning of responses to open-ended questions, obtain more
complete information, and to obtain second and third responses to the same question where requested. A probe is a neutrally worded
prompt designed to elicit further responses. Probes might include: "Can you tell me more about that?" "Is there anything else?"
"Anything else?" "What else?" A brief, silent pause also frequently worked to elicit further response.
Matrix or battery questions involve a series of closed-ended questions using the same responses and asked sequentially. An
introduction includes a general question, followed by the response categories, followed finally by items. Sometimes the response
categories are forgotten so that interviewers must repeat them. Repeating the response categories is an appropriate prompt whenever
required. Most respondents learn appropriate responses in battery formatted sets of questions after the first one or two items.
Forms and Materials.
Immediately when an interview is completed and calling terminated, interviewers should check every item to make sure that each
is marked appropriately. Be especially careful to insure that:
1) One and only one response is marked for each item,
2) Written information is easily legible,
3) No item has been omitted (if a question has been omitted, call back the respondent immediately, apologize, ask the
question, and thank the respondent). Of course, if a respondent has refused to answer a question after appropriate follow
up, the interviewer should mark "NA" for no answer and not re-contact the respondent.
4) All questions have a response marked appropriately.
The disposition of each and every call is to be recorded using codes at the bottom of calling sheets in the row corresponding to the number called (SEE ATTACHED BLANK CALLING SHEET). One mark must be made for each call to each number. If an interview is completed on the first call, for instance, then mark "CO" under the heading "Initial Call" for the number. If no one answers on the initial call, the number is busy on the second call, and the interview is actually completed on the third call, then the call sheet would be marked NA, BU, and CO under the columns labeled "Initial Call," "First Call Back," and "Second Call Back," respectively. Once completed, the calling sheets provide the history of interviewing attempts for each number in the sample.
Outcomes and codes include:
NK: Ineligible. No eligible persons are in the household,
CO: Interview completed,
REF: Interview refused, respondent hangs up without replying,
LA: Cannot determine language, but not English,
MA: Answering machine. Leave a message with your name, number, and time for respondent to contact interviewer,
but interviewer should also call back at next available chance (since most respondents will not return calls),
BU: Line is busy,
CB: Call back,
APT: Appointment to call back,
NA: No answer (also used at the item level when a respondent refuses to answer a question),
DIS: Number disconnected, or
IN: Ineligible number (Business or Institution), not a residence.
Interviewers should cycle through numbers from the top of each calling sheet to the bottom, dialing one number at a time and
moving on to the next as an interview is completed, refused, a busy signal received, a machine answers, an ineligible number is
contacted (e.g., a business), a household with no eligible persons has been called, or some other disposition has been made for the
number. Note in small print beside the code the day and time of the call. After the end of the list is reached, simply cycle through the list a second time, on a different day and at a different time of day, calling each number again. Do not redial numbers at which respondents have refused, numbers not belonging to residences, or numbers that do not exist (but nonetheless ring).
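A calling sheet of this kind can also be kept as a small data structure. The sketch below records one disposition code per attempt for each number and flags which numbers should not be redialed, using the codes listed above; the telephone numbers shown are placeholders.

# Minimal sketch of a calling sheet: one row per sampled number, one disposition
# code per attempt, using the codes defined above. Numbers shown are placeholders.
DO_NOT_REDIAL = {"CO", "REF", "IN", "DIS", "NK"}   # completed, refused, or ineligible

calling_sheet = {
    "619-555-0001": ["NA", "BU", "CO"],   # no answer, busy, then completed on the third call
    "619-555-0002": ["REF"],              # refused on the initial call
    "619-555-0003": ["MA"],               # answering machine; call back later
}

def needs_another_call(attempts):
    """A number stays in the rotation until a final disposition is recorded."""
    return not attempts or attempts[-1] not in DO_NOT_REDIAL

for number, attempts in calling_sheet.items():
    status = "call again on a different day/time" if needs_another_call(attempts) else "done"
    print(number, attempts, "->", status)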
The best interviewer will be used as a "conversion interviewer." This person will re-contact respondents who refuse to be
interviewed initially following a two week interval. The conversion interviewer will make a specific attempt to complete an interview
for those marked REF.
It is imperative that no materials are lost during the interviewing process. All interview protocols, uncompleted as well as
completed, calling lists, instructions, and any other materials associated with the survey should be returned to the supervisor as soon
as possible. All items belong to the project and are to be returned at the end of interviewing.
Supervision:
The field supervisor is the key person during the fieldwork phase of a survey project. The supervisor is responsible for getting the interviewing done and done correctly. The supervisor is also responsible for seeing that pertinent information is communicated to the principal investigator by e-mail and/or telephone as appropriate.
More specifically, the field supervisor:
Oversees the drawing and processing of the sample of telephone numbers and ensures that it follows the sampling frame,
Hires, fires, and trains interviewers,
Oversees the quality of interviewers' work, and provides appropriate correction of errors in interviewing in real time (as
errors occur),
Completes a daily work report for each interviewer who works that day, summarizing the number of interview attempts, completions, and non-completions (and the type of non-completion) for that interviewer,
Completes a weekly work report summarizing the number of interview attempts, completions, and non-completions (and the type of non-completion) for the project that week (a minimal tally of this kind is sketched after this list),
Edits all interviews for errors (including items that are not marked, items that have more than one response marked, illegible markings, and other irregularities); these errors are to be corrected in consultation with the interviewer at the earliest possible time,
Oversees the entry and verification of responses into computer files,
Oversees the production of all forms, including the interview schedules, calling sheets, recording the disposition of
numbers during interviewing, and interviewer performance records,
Insures that interviews are conducted in a timely fashion and that all project deadlines are met,
Reports the progress of survey fieldwork to the principal investigator by e-mail each Thursday,
Reports problems to the principal investigator by e-mail and/or telephone as problems arise,
Transmits cumulative data files to the principal investigator by e-mail each Thursday, as interviews are converted to machine-readable form.
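As a rough illustration of the daily work report mentioned in the list above, the tally could be kept by a short script rather than by hand; the call records, field names, and dates below are hypothetical, and the disposition codes follow the calling-sheet codes defined earlier.

# Minimal sketch, assuming a simple list of call records kept by the supervisor.
# Disposition codes follow the calling-sheet codes (CO, REF, NA, BU, ...).
from collections import Counter

calls = [
    {"interviewer": "A", "date": "2004-03-11", "disposition": "CO"},
    {"interviewer": "A", "date": "2004-03-11", "disposition": "NA"},
    {"interviewer": "B", "date": "2004-03-11", "disposition": "REF"},
    {"interviewer": "B", "date": "2004-03-12", "disposition": "CO"},
]

def daily_report(calls, interviewer, date):
    """Attempts, completions, and non-completions by type for one interviewer-day."""
    subset = [c for c in calls if c["interviewer"] == interviewer and c["date"] == date]
    by_type = Counter(c["disposition"] for c in subset)
    return {
        "attempts": len(subset),
        "completions": by_type.get("CO", 0),
        "non_completions": {k: v for k, v in by_type.items() if k != "CO"},
    }

print(daily_report(calls, "A", "2004-03-11"))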
The Sample:
Social scientists design probability samples so that a small number of people can be used to represent accurately the characteristics of very large, socially significant, well-defined populations. Probability samples are the only type of sample that permits the computation of error in estimating population parameters (characteristics of the population) from sample statistics (characteristics of the sample). No other type of sample justifies statistical testing of hypotheses!
1) A probability sample involves the selection of a specific number of people such that the probability of any single individual being included in the sample is known or can be computed with precision.
2) A simple random sample is a probability sample in which the probability of selecting any individual in the population is the same for every individual (a minimal sketch of both ideas follows below).
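The sketch below illustrates these two definitions with assumed numbers: it draws a simple random sample from a hypothetical population list and computes the conventional 95% margin of error for a sample proportion. The population size, sample size, and worst-case proportion are illustrative, not a description of the class survey.

# Minimal sketch of a simple random sample and the sampling error it permits.
# Population size and sample size are hypothetical.
import math
import random

population = list(range(100_000))      # stand-in list of population members
n = 500                                # sample size

# Simple random sample: every individual has the same selection probability, n/N.
sample = random.sample(population, n)
print("selection probability:", n / len(population))

# Because selection probabilities are known, sampling error can be computed.
# 95% margin of error for a proportion p estimated from the sample (worst case p = 0.5):
p = 0.5
moe = 1.96 * math.sqrt(p * (1 - p) / n)
print(f"margin of error: +/- {moe:.3f}")   # about +/- 0.044, i.e. 4.4 points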
The probabilistic character of sampling depends on following a set of procedures strictly. The procedures involve selection of
cases in a mathematically random way. If the procedures are not followed, then the sample is not a probability sample and statistical
inference is not possible. For probability sampling the following are required:
1) Must have a well-defined population. This requires precise knowledge of the population from which the sample is to be taken, including precise definition of ages (adults 18 or older), locations, relationships (anyone who resides regularly in the household), immigration status (Korean nationals, not foreigners), and types of dwelling units (private homes and apartments that have residential telephones, but not institutional residences) in Seoul.
2) Must have a well-defined procedure for selection. The procedure includes two selection processes: 1) random selection of telephone numbers from listings in the Seoul telephone directory; 2) random selection of persons within households. It is imperative to interview the person who is selected by the procedure. No substitutions are allowed.
3) Must have high completion rates for interview attempts. Great effort must be expended to complete interviews with all
eligible persons contacted.
4) Must present standard stimuli. Interview is completely structured and items are to be read verbatim and in order given
in the survey (with very few exceptions). Any exceptions must be documented thoroughly and may result in
disallowance of the interview in the study.
We use a form of probability sampling in this study in Seoul called random-digit-dialing (RDD). RDD begins with a random
selection of 5,000 telephone numbers in Seoul from the best available listing of telephone subscribers for the city. Selection of
telephone numbers and development of the final sample involves three steps:
1) Select pages in telephone directory from which numbers are to be drawn, called targeted pages.
2) Select a cluster of numbers from each selected page in telephone directory, called targeted numbers.
3) Add a constant to the last portion of the listed telephone number and record the final result on the calling sheet.
Targeted pages are selected by first determining the page numbers of the first and last pages of residential listings in the best
available telephone directory for Seoul. We will then generate 1,000 random numbers between the first and last pages by computer.
Numbers are to be read row by row beginning at the top of the page from left to right. The first number selects the first page to be
included in the sample, the second number selects the second page to be included in the sample, etc.
For example, look at the table of "Selected Random Numbers, 1-975" below. Random numbers between 1 and 975 were generated to match a recent San Diego telephone directory with residential listings on pages 1-975. Note that the table presents 10 rows of 15 random numbers each. The first random number is 134, so the first page selected from the directory will be page 134. The second random number specifies that page 358 will be selected, the third number specifies that page 201 will be selected, the fourth number that page 7 will be selected, etc.
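The three RDD steps can be sketched as follows. The cluster size, the added constant, and the stand-in directory listings are hypothetical (1,000 pages times 5 numbers per page yields the 5,000 numbers mentioned above).

# Minimal sketch of the three RDD steps described above: select targeted pages,
# select a cluster of targeted numbers from each page, and add a constant to the
# last portion of each listed number. The constant and directory listings are
# hypothetical stand-ins.
import random

FIRST_PAGE, LAST_PAGE = 1, 975     # first and last pages of residential listings
PAGES_TO_DRAW = 1000               # targeted pages
CLUSTER_SIZE = 5                   # targeted numbers drawn from each page
CONSTANT = 7                       # added to the last portion of each listed number

def listed_numbers_on(page):
    """Stand-in for reading the listed numbers printed on a directory page."""
    return [f"619-555-{random.randint(0, 9999):04d}" for _ in range(40)]

calling_sheet = []
targeted_pages = [random.randint(FIRST_PAGE, LAST_PAGE) for _ in range(PAGES_TO_DRAW)]

for page in targeted_pages:
    cluster = random.sample(listed_numbers_on(page), CLUSTER_SIZE)
    for listing in cluster:
        prefix, last4 = listing.rsplit("-", 1)
        rdd = f"{prefix}-{(int(last4) + CONSTANT) % 10000:04d}"  # unlisted numbers become reachable
        calling_sheet.append(rdd)

print(len(calling_sheet), calling_sheet[:3])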
Example IRB Human Subjects Approval Application:
Misogyny and Political Talk Radio
C. Richard Hofstetter
Department of Political Science
594-6244
1. Study Abstract: This is a survey study of a cross-section of the adult public residing in San Diego that can be reached by residential telephone (96% of all residences). About 500 randomly selected adults are to be interviewed by telephone, using a standard interview format, about their use of political talk radio and their attitudes concerning intergroup relations and misogyny. The
interviewing will be conducted by students in the investigator's classes following training in survey methodology as an integral
part of their classroom instruction involving the theory and methods of political behavior research. Potential risks to
respondents involve minor irritation with question content as is normally encountered in literally all survey research. Risk
beyond that does not exist, since all names and telephone numbers will be used to validate that interviewing was actually
conducted as stated by students and then will be destroyed. In no case will telephone numbers, names, or addresses be attached
to data files so that the identity of all respondents is absolutely protected. All student interviewers will be given instruction in
the methods of survey interviewing so as to minimize irritation and in the ethics of survey research to maintain confidentiality.
2. The purpose of the survey is, first, to provide training in the methods of survey data collection in a political behavior class,
second, to provide data for student theses and term papers, and, third, for possible publication with students. Human behavior
is the focus of the study, and it cannot be done without asking people questions about their views and behaviors. The main
hypothesis of the study is that exposure to some of the more flamboyant political talk show content is related to the
reinforcement and, possibly, the development of misogynous attitudes in the American political system.
3. Respondents (subjects) will be recruited using normal survey methodology. A random digit dial sample of San Diego
residents will be computed. Adults within households who assent to cooperate with the study will be interviewed by telephone
by students under the supervision of a graduate student. Persons 18 or over who can speak English well enough to participate
in the interview process and who assent to participate will be included. No special groups are included for study. I know of no
specific problems involving subjects for the study, since subjects are free to terminate the interview at any time or to refuse to
participate initially. Only numbers (no names, telephone numbers, or other identifying information) will be entered in data files,
and all other information identifying respondents will be destroyed once interviewing has been validated. A limited number of
paper and pencil questionnaires may be mailed out to respondents who agree to participate in a follow up survey. As soon as
data have been entered into a computer, all identifying information will be destroyed.
4. Respondents will be recruited from the adult public in San Diego in households that can be reached by residential telephone.
Specific adults will be selected within households using the "most recent birthday" criterion. Subjects will be asked a series of
questions and responses recorded. No experimental procedures are involved. The only experimental aspect of this study is the
gathering of information for the purpose of analysis. No special procedures will be used, and nearly all interviews will be
completed in between 15-25 minutes, depending on rapidity of subject response and skill of interviewer. The study will be in
San Diego. The questionnaire draft is attached. No deception is involved in any way in the study.
5. Benefits to subjects. Most subjects find participation in legitimate surveys of this kind to be entertaining and interesting.
General knowledge may be learned regarding intergroup relations in a democracy, but the main gain is to educate students in
classes about the nature and measurement of public opinion.
6. Minor psychological irritations as noted.
7. Precautions involve training and supervising interviewers. I also separate identifying information from interviews, attach ID
numbers to forms, and destroy the former as soon as interviews are validated. All data are coded in numeric form so that no
alphabetical codes are entered in the data set. No names, addresses, or telephone numbers are entered into the data set.
8. No compensation of subjects will be used in this study.
9. I am a full professor, step IV, who has been conducting surveys since 1966. I have over 150 refereed publications, most of which involve survey research in one form or another.