
The Efficacy of Syntax Checkers on the Quality of Accounting Students' Writing

George S. Peek, Tony Eubanks, Claire May, Patsy Heil

The increased student use of microcomputer-based word-processing programs has generated interest in whether computer aids may improve the quality of both student and "real world" writing and, if so, how. Some studies indicate that using a word-processing program often improves writing because of the ease with which writers may revise compositions or evaluate alternative expressions and structures (Daiute, 1986; Lutz, 1987); other studies do not support this finding (Harris, 1985; Hawisher, 1987). With additional software, computers may also provide other composition aids, including spelling checkers, on-line thesaurus helpers, outliners, and syntax checkers. This study is concerned with the effect of syntax checkers on the quality of student writing in an intermediate accounting course. The findings indicate that, for the study population, there was no significant difference between the students who had access to the syntax checker and those who did not.

Related Studies

Borthick and Clark (1987) reported that accounting students using a computer-implemented writing aid scored better on two different assignments than students not using the aid. The findings indicated that student use of writing aids has the potential to improve accountants' written communication. In contrast, The Wall Street Journal (7 July 1986) carried a story suggesting that computerized writing aids were not particularly effective in improving the writing of already competent persons; however, the story also suggested that such aids might be of some use to weaker writers by indicating problem areas for revision. Both articles refer to RIGHTWRITER, a readily available "language analyzer" that purports to provide information about the style, grammar, spelling, and readability of a piece of writing.

Reviews of RIGHTWRITER and related syntax-checking software indicate its potential usefulness for many business applications, especially for short memos (Bates, 1984; Cunningham, 1988; Lewis & Lewis, 1987; Mortensen, 1987; Raskin, 1986). However, reviewers are generally reacting to a few simple tests of the software and to the documentation, not to an extensive evaluation of the software in a large-scale implementation. Several empirical studies using the UNIX-based WRITER'S WORKBENCH software developed by Bell Laboratories support the potential value of computerized writing aids (Cherry & Macdonald, 1983; Frase, Macdonald, Gingrich, Keenan, & Collymore, 1981; Kiefer & Smith, 1983; Sterkel, Johnson, & Sjogren, 1986). To our knowledge, however, no empirical studies have been done using commercially available PC-based software.

Communication Skills in Accounting

Accounting is a communication-intensive discipline. Although a common perception is that accountants work simply with numbers, the reality is that accountants communicate a firm's financial data to information users, providing analysis and interpretation of these data in a variety of contexts and to a varied user group. Users comprise a broad range of people: External users may include stockholders, pension fund managers, bankers, or government agencies; internal users include managers at all levels within the firm, representing all functional business areas. Compared to communication, numbers play a small, albeit crucial, role in the accountant's world.

Accountants must be able to communicate with a wide range of persons: from unsophisticated clients to major corporate or governmental executives, from newly hired subordinates to senior technical partners. The accountant must communicate with others in the firm, with direct and indirect clients, with outside agencies and offices, and with the general public. Writing is central to the accountant's activity, and it is a major concern to practitioners and accounting educators (Andrews & Koester, 1979; Corman, 1986; Gingras, 1987; Ingram & Frazier, 1980; May & Arevalo, 1983; Orem & Burns, 1988); their consensus is that more must be done at all levels to improve communication skills.

Employers are concerned that entry-level personnel cannot communicate well; accounting students often have not sufficiently practiced writing skills in their undergraduate education, and they often have not received adequate feedback about their writing to recognize problems or to improve its quality. Practitioners themselves are concerned about their own writing skills and about the inefficiency caused by poor communication in the workplace. Employers and practitioners would welcome any means of improving writing skills because demands on time are heavy in the accounting profession and because productivity is enhanced by clear communication. Computerized writing aids would ease the time pressures and might increase the clarity of prose by allowing individuals to do more substantive editing of their writing before sending it out to business colleagues and clients.

Accounting educators are painfully aware that more writing practice is needed to help students exercise cognitive, critical, and technical skills; to socialize them into the profession; and to prepare them for the variety of situations they will encounter. Many accounting educators do not feel comfortable making stylistic comments on student papers, though they recognize that something may be wrong; moreover, most do not have time in courses to work specifically with writing skills because of the demands of course content, which increase every year as new requirements and procedures are developed in the business community. Computerized writing aids have the potential to allow accounting educators to require more writing by shifting some of the burden of evaluation to the computer.

Method

The purpose of the current study was to investigate whether a readily available computer writing aid produced any discernible improvement in student writing. External intervention in the writing process is often beneficial to writers; one example of a computer-based intervention system is the "syntax checker" mentioned previously. Other types of intervention, especially peer review, may also be effective for making students aware of problem areas in their compositions but were not the focus of this investigation. If computer-assisted intervention is useful, not only would students have an additional evaluation tool, but human reviewers might also be able to concentrate on more important and complex composition concerns without being distracted by relatively less important technical or surface errors. Further, the more feedback students have on a piece of writing, the more conscious choices they make in selecting particular structures and designs. The syntax checker has the potential to make students more aware of problems and to focus their attention on dealing with those problems (Borthick & Clark, 1987).

The study population comprised all Intermediate Accounting II students during the 1986 spring quarter at a major southeastern university. These students were required to write two graded assignments during the quarter: one, an in-class client letter based on material covered in the Intermediate Accounting I course; and the other, a client letter involving a technical explanation of an accounting recommendation. Sixty-five students enrolled in Intermediate Accounting II; of these, fifty-nine completed both writing assignments. Students were in two different sections taught by two different instructors. Data from this study are based only on those students completing both assignments and are relevant only for this population.

Students did not have access to a computer for the first assignment; for the second assignment, all students were required to use PC-WRITE 2.55 to prepare drafts and final papers on the computer. Students turned in both hard-copy and floppy-diskette versions of their papers. The floppy-diskette versions of the second assignment from one of the sections were later used in the RIGHTWRITER analysis.

All students completed the same writing assignments under very similar conditions. The in-class letters were written by all students in a two-hour evening session and were then scored by a four-person team, including the two instructors for the course and two writing consultants from the College of Business. No writing instruction was provided by the course instructors; both classes received identical two-hour reviews of writing principles and peer-evaluation strategies from the writing consultant in the School of Accounting. The out-of-class assignment was introduced to both classes via a mimeographed assignment sheet, which indicated audience and purpose and explained that students would need to perform outside library research in order to complete the assignment.

All students in the Intermediate Accounting II course completed the same set of assignments and used the same textbook. Instructors prepared a common syllabus that specified required topics and problems and the timing of assignments and exams. Instructors prepared common exams that were administered jointly in evening sessions to all students at the same time and under the same conditions. These exams were graded using an agreed-upon answer key and weighting scheme. Course expectations were thus jointly defined and implemented by the two instructors; although instructor personalities may have differed, the basic course experience did not. Instructor expectations were similar for the students in this study population; multiple instructors should not be a factor in evaluating the results of this study.

For the second assignment, students were given instruction in using the College of Business's microcomputer facilities by members of the computer support staff. All students completed the required introductory session and were certified to use the College's facilities, which include IBM PCs, IBM PC-XTs, and AT&T 6300 PCs. Students were required to purchase two floppy diskettes: one for the PC-WRITE word-processing program and one for both their letter and the output from the syntax program.

Many students had no experience using either microcomputers or word-processing programs. The second assignment was given with sufficient lead time for students to become reasonably proficient at using PC-WRITE before the first draft was due for review by the peer reviewers and the syntax checker. One instructor spent a total of ten hours over a two-week period working with students in the computer lab to answer questions and to facilitate use of the word-processing program. By the time the final papers were due, all students had mastered the basics of PC-WRITE, and all papers conformed to minimum format expectations. That is, all students were able to master the word-processing program and to use it efficiently for revision.

All students were required to submit their second papers for two peer reviews. Peer review techniques were discussed by the writing consultant and practiced in Intermediate Accounting I, and these techniques were reinforced by the writing consultant in Intermediate Accounting II. All students, therefore, had human feedback on their second papers prior to submission for a grade. Because the accounting program encourages a cooperative spirit among the students, most responded favorably to providing and receiving peer reviews. Final drafts of these papers were scored holistically by the four-person team.

Holistic scoring followed the "General Impression Marking" approach: Papers were given a single overall evaluation, with a score of one (1) representing superior work, two (2) high average work, three (3) low average work, and four (4) poor work. The four graders, two of whom had experience in holistic grading, spent a three-hour session discussing the assignment, developing general criteria, and grading a test packet of ten papers drawn randomly from the population. The goal of this session was to become confident and proficient in applying the holistic scoring concept. The "General Impression Marking" approach was selected because it corresponds to the type of evaluation practiced by managers: The overall impression of the writing is most important, and the overall impression is hurt by flaws in any major composition component. (See Charney, 1984, for a discussion of holistic scoring approaches and techniques.)

The project was designed using a single-factor analysis of variance (ANOVA) model. The first writing assignment was designed to determine whether the two classes were statistically similar and whether the graders could achieve an acceptable level of consistency in scoring the papers. The requirements of the second assignment were the same for all students, except that one group received the output from a computerized syntax checker. The two factor levels were thus composed of students who received additional information through a software program and students who did not.

Students did not use RIGHTWRITER themselves; rather, they turned in diskettes containing their papers, and the program was run for them. RIGHTWRITER placed a marked output file on the disk, and treatment-group students were instructed to print out this file and use the syntax checker's comments in their revisions.

The first assignment was scored holistically by the four graders after an initial session to determine expectations and criteria for scoring. Cronbach's Alpha was used to assess inter-rater reliability for the scoring of the first assignment; the reliability coefficient was 0.89, indicating strong and consistent agreement among the graders. A t-test was then performed on the data to determine whether there was a significant difference between the two sections of the course. Results of the t-test for the first assignment indicated no significant difference in scores between the two classes. (See Table 1.) Therefore, if any significant difference in scores were to occur on the second assignment, the difference might be attributable to the intervention provided by the syntax checker. The basic hypothesis tested was as follows:

Ho: Students who have syntax checker intervention do not have scores different from those who have no such intervention.

Ha: Students who have syntax checker intervention do have scores different from those who have no such intervention.
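
For readers who wish to see how these statistics are computed, the following is a minimal sketch in Python (using numpy and scipy, our choice of tools, not the study's). The grader scores shown are hypothetical stand-ins, since the article does not reproduce the raw data.

import numpy as np
from scipy import stats

# Hypothetical holistic scores: rows = students, columns = the four graders.
scores = np.array([
    [2, 2, 3, 2],
    [3, 3, 3, 4],
    [1, 2, 1, 1],
    [4, 3, 4, 4],
    [2, 3, 2, 2],
])

def cronbach_alpha(items):
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

print(cronbach_alpha(scores))  # the study reports 0.89 for the first assignment

# Two-sample t-test between the two sections' scores (hypothetical vectors);
# stats.ttest_ind pools the variances by default, as in a standard t-test.
section_a = np.array([2, 3, 1, 4, 2, 3])
section_b = np.array([3, 3, 2, 4, 3])
t, p = stats.ttest_ind(section_a, section_b)
print(t, p)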


Table 1: Results and Comparison of Scores Between Assignments 1 and 2

Students             N    Mean    SD      Std. Error

Scores for Assignment 1 (1)
RIGHTWRITER Users    35   2.614   0.684   0.12
Non-Users            24   2.884   0.824   0.17

Scores for Assignment 2 (2)
RIGHTWRITER Users    35   2.950   0.638   0.11
Non-Users            24   3.240   0.728   0.15

Improvement in Scores Between Assignments 1 and 2 (3)
RIGHTWRITER Users    35   0.336   0.661   0.11
Non-Users            24   0.396   0.671   0.15

(1) t = 1.12; p = 0.27, NS
(2) t = 1.58; p = 0.12, NS
(3) t = 0.34; p = 0.74, NS


Because the main difference between the two groups of students was the additional information provided by the syntax checker, any difference in scores could be attributable to the students' use of the additional information. The results of the ANOVA procedure are contained in Table 2. The ANOVA results indicated that scores for those students receiving the syntax checker feedback were not significantly different from scores for the non-treatment group; no score difference (either improvement or decline) may be attributable to the added intervention.


Table 2: Results of the ANOVA Procedure

Analysis of Variance on Assignment 2

Source    DF   SS       MS      F
Groups     1    1.194   1.194   2.61
Error     57   26.035   0.457
Total     58   27.229

Students             N    Mean    SD
RIGHTWRITER Users    35   2.950   0.638
Non-Users            24   3.240   0.728
Pooled SD = 0.676

F(0.95; 1, 57) ~= 4.0, and F* = 2.61; therefore, conclude Ho (there is no difference between the means of the groups).

Analysis of Variance on Score Difference Between Assignments

Source    DF   SS       MS      F
Groups     1    0.051   0.051   0.12
Error     57   25.232   0.443
Total     58   25.284

Students             N    Mean     SD
RIGHTWRITER Users    35   0.3357   0.6613
Non-Users            24   0.3958   0.6713
Pooled SD = 0.6653

F* = 0.12, which is less than 4.0; therefore, conclude Ho (there is no difference between the means of the groups).
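
As a check on the figures in Table 2, the one-way ANOVA F statistic can be recomputed from the summary statistics alone (group sizes, means, and standard deviations). The short Python sketch below uses the standard sums-of-squares identities; it is an illustration of the arithmetic, not the study's original analysis.

# (n, mean, SD) for each group, taken from Table 2 (Assignment 2 scores)
groups = [(35, 2.950, 0.638), (24, 3.240, 0.728)]

N = sum(n for n, _, _ in groups)
grand_mean = sum(n * m for n, m, _ in groups) / N

# Between-groups and within-groups sums of squares
ss_between = sum(n * (m - grand_mean) ** 2 for n, m, _ in groups)
ss_within = sum((n - 1) * sd ** 2 for n, _, sd in groups)

df_between = len(groups) - 1   # 1
df_within = N - len(groups)    # 57

F = (ss_between / df_between) / (ss_within / df_within)
print(round(ss_between, 3), round(ss_within, 3), round(F, 2))
# ~1.197, ~26.029, 2.62 -- in close agreement with Table 2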


The ANOVA procedure on the difference between scores on the first and second assignments also indicates no significant difference between the two groups in how their performance changed from the first assignment to the second. Scores for each group were worse on the second assignment than on the first by nearly the same amount (users = 0.3357; non-users = 0.3958), and the t-test indicates no statistical difference. The second assignment was somewhat more complex than the first, a factor which might explain why overall scores were lower even after students had made extensive revisions of their writing. Cronbach's Alpha was not available for the second set of papers, but the Pearson correlation coefficients are higher than for the first set, indicating inter-rater agreement and high reliability (see Table 3).


Table 3: Results of Tests for Correlation Coefficients and for Equal Population Medians

Pearson Correlation Coefficients for Graders

Assignment 1
        1       2       3
2    0.622
3    0.691   0.705
4    0.646   0.696   0.762

Assignment 2
        1       2       3
2    0.774
3    0.721   0.696
4    0.777   0.753   0.815

Mann-Whitney Test for Equal Population Medians

Group      N    Median
User       35   3.00
Non-User   24   3.25

Test of eta1 = eta2: W = 843; significant at 0.0587; cannot reject at alpha = 0.05.
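
The Mann-Whitney test and the grader correlations in Table 3 can be computed with standard library routines; the sketch below (again in Python with numpy and scipy, our assumption) uses hypothetical score vectors, since the article reports only the medians, W, and the significance level.

import numpy as np
from scipy import stats

# Hypothetical holistic scores for the two groups (raw data not given in the article)
users = np.array([3.0, 2.5, 3.25, 2.0, 3.5, 3.0])
non_users = np.array([3.25, 3.5, 3.0, 4.0, 2.75])

# Mann-Whitney test for equal population medians
u, p = stats.mannwhitneyu(users, non_users, alternative='two-sided')
print(u, p)

# Pearson correlations among graders (columns = graders), as in Table 3
grader_scores = np.array([
    [2, 2, 3, 2],
    [3, 3, 3, 4],
    [1, 2, 1, 1],
    [4, 3, 4, 4],
])
print(np.corrcoef(grader_scores, rowvar=False))  # 4 x 4 matrix of r values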



Discussion

Some experimental limitations may explain the negative findings in this study. First, the students in the Intermediate Accounting II course were both intelligent and motivated. They had been achievers for many years, and they represent a group whose writing skills are reasonably well developed. Thus, The Wall Street Journal's caveat that competent writers may not get much help from syntax checkers appears confirmed. Second, all papers for the second assignment underwent two peer reviews, thereby providing substantial human feedback concurrent with the computer output. Students in both groups may have benefited considerably from this human feedback, perhaps to the point that the syntax checker could provide little incremental information to the treatment group. The combination of these two factors may indicate that, for good students who work cooperatively to produce effective business communication, a syntax checker offers too little additional information to matter. The current state of syntax checkers for microcomputers may be such that competent writers gain very little from their use (Wessell, 1986).

The library component of the second paper may also have been a factor in the negative findings of this study, especially because the overall success of the paper was partially a function of competent research and of the skillful incorporation of that research into the composition. The syntax checker can provide no feedback for this component of the assignment. However, even though assignment complexity may explain the lower average scores on the second assignment, it does not account for the lack of difference between the two groups. Scores might not necessarily improve overall (although this was expected), but clearly some difference was anticipated between users and non-users of writing aids. Another explanation for the negative findings may be that the syntax checker works best with short documents, such as one-page memoranda or letters. The average length of the second assignment was five double-spaced pages. RIGHTWRITER provided numerous annotations for most student papers, so it is possible that students could not quickly internalize and use the information. In addition, no formal instruction was provided on how to incorporate the syntax checker into a revision strategy. Students may have had initial difficulty interpreting the output; moreover, because of time constraints, they had no opportunity to see the syntax checker's evaluation of subsequent drafts. One computerized review of a complex assignment may be insufficient to provide adequate feedback for improvement.

Syntax checkers may indeed be valuable to those who have the types of problems on which RIGHTWRITER concentrates, especially passive-voice constructions and long, unwieldy sentences. Some weaker writers may also benefit from increased attention to common problems in their writing, such as lack of subject-verb agreement or use of slang, and they may draw upon their prior knowledge to improve their writing. However, there is no hard evidence (other than the 1987 Borthick and Clark study) to show that commercially available PC-based syntax-checking software actually improves writing. The syntax checker might serve as a reinforcement or a reminder of things the writer already knows, rather than as a remediation tool for the weaker writer.

Further research needs to be done using the syntax checker across groups with different skill levels, as opposed to a homogeneous group such as the one in the current study. Research also needs to be done using different types and lengths of assignments, perhaps even different audiences and writing contexts. In addition, these investigations should address not only how best to incorporate the syntax checker into a writing program, but also how accounting practitioners could use it in "real world" situations. The use of a computerized tool to aid in the composition process is both reasonable and welcome; perhaps the problem is that the technology simply has not yet caught up with the needs of the writers in the study population. Another possibility is that the students simply ignored the additional information and prepared the assignment as they had done in the past.

Accounting students need more writing practice and more support to develop their overall communication skills. Employers indicate that the single most glaring weakness among new employees is poor writing, and they lament the lost productivity and increased costs associated with such poor skills. Accounting practitioners want to improve their own writing skills, and they look for ways to facilitate such improvement. Accounting educators recognize the need for more writing and for improved writing instruction. However, they are often unable to provide much assistance to students because of course content constraints or a lack of expertise in evaluating writing. Many instructors are reluctant to assign much writing because of the additional burden it creates in a course. A computerized writing aid that intervened before the student or practitioner submitted a document for perusal would minimize this burden and improve efficiency. The idea of the syntax checker is a good one, but perhaps the implementation needs a bit more work. If computerized writing aids were effective, students could be encouraged to make them a standard component of their revision strategies.

George S. Peek teaches at Western Illinois University in Macomb, Illinois; Tony Eubanks teaches at Georgia State University in Atlanta, Georgia; and Claire May and Patsy Heil teach at the University of Georgia in Athens, Georgia.

References

Andrews, J. D., & Koester, R. J. (1979). Communication difficulties as perceived by the accounting profession and professors of accounting. Journal of Business Communication, 16, 33-42.

Andrews, J. D., & Pytlik, B. P. (1983). Revision techniques for accountants: Means for more effective and efficient written communication. Issues in Accounting Education, 152-163.

Andrews, J. D., & Sigband, N. B. (1984). How effectively does the "new" accountant communicate? Perceptions by practitioners and academics. Journal of Business Communication, 21, 15-24.

Arevalo, C. B. (1984). Effective writing: A handbook for accountants. Englewood Cliffs: Prentice-Hall.

Armitage, H. M., & Boritz, J. E. (1986). Integrating computers into the accounting curriculum. Issues in Accounting Education, 1, 86-99.

Bates, P. (1984, October). How to turn your writing into communication. Personal Computing, pp. 84-85, 87-88, 91, 93.

Borthick, A. F., & Clark, R. L. (1986). The role of productive thinking in affecting student learning with microcomputers in accounting education. The Accounting Review, 61, 143-157.

Borthick, A. F., & Clark, R. L. (1987). Improving accounting majors' writing quality: The role of language analysis in attention directing. Issues in Accounting Education, 2, 13-27.

Catano, J. V. (1985). Computer-based writing: Navigating the fluid text. College Composition and Communication, 36, 309-316.

Charney, D. (1984). The validity of using holistic scoring to evaluate writing. Research in the Teaching of English, 18, 65-81.

Cherry, L. L., Fox, M. L., Frase, L. T., Gingrich, P. S., Keenan, S. A., & Macdonald, N. H. (1983). Computer aids for text analysis. Bell Laboratories Record, 61(5), 10-16.

Cherry, L. L., & Macdonald, N. H. (1983, October). The UNIX WRITER'S WORKBENCH software. BYTE, pp. 241-248.

Collier, R. M. (1983). The word processor and revision strategies. College Composition and Communication, 34, 149-155.

Corman, E. J. (1986). A writing program for accounting courses. Journal of Accounting Education, 4, 85-95.

Cunningham, D. H. (1988). RIGHTWRITER 2.0. The Technical Writing Teacher, 15(1), 84-86.

Daiute, C. (1986). Physical and cognitive factors in revising: Insights from studies with computers. Research in the Teaching of English, 20, 141-159.

Dalton, R. (1985, November). Specialized writing tools. Popular Computing, pp. 21-22, 24, 27.

Frase, L. T., Macdonald, N. H., Gingrich, P. S., Keenan, S. A., & Collymore, J. L. (1981). Computer aids for text assessment and writing instruction. NSPI Journal, 20(9), 21-24.

Gingras, R. T. (1987). Writing and the certified public accountant. Journal of Accounting Education, 5, 127-137.

Harris, J. (1985). Student writers and word processing: A preliminary evaluation. College Composition and Communication, 36, 323-330.

Hawisher, G. E. (1987). The effects of word processing on the revision strategies of college freshmen. Research in the Teaching of English, 21, 145-159.

Ingram, R. W. (Ed.). (1988). Computer integration into the accounting curriculum: Case studies. Sarasota: American Accounting Association.

Ingram, R. W., & Frazier, C. R. (1980). Developing communications skills for the accounting profession. Sarasota: American Accounting Association.

Kiefer, K. E., & Smith, C. R. (1983). Textual analysis with computers: Tests of Bell Laboratories' computer software. Research in the Teaching of English, 17, 201-214.

Lewis, B., & Lewis, R. (1987, June). Do style checkers work? PC World, pp. 246-252.

Lutz, J. A. (1987). A study of professional and experienced writers revising and editing at the computer and with pen and paper. Research in the Teaching of English, 21, 398-421.

May, G. S., & Arevalo, C. (1983). Integrating effective writing skills in the accounting curriculum. Georgia Journal of Accounting, 4, 119-126.

Monahan, B. D. (1984). Revision strategies of basic and competent writers as they write for different audiences. Research in the Teaching of English, 18, 288-304.

Mortensen, T. (1987). Writing style/readability checkers to add to your word processing. Computers and Composition, 5(1), 67-77.

Orem, E., & Burns, J. O. (1988). The problems and challenges of teaching accounting students to communicate. Georgia Journal of Accounting, 9, 9-24.

Raskin, R. (1986, May 27). The quest for style. PC Magazine, pp. 189-192, 198-202, 206-207.

Raskin, R. (1986, May 27). WRITER'S WORKBENCH: The granddaddy of style. PC Magazine, p. 194.

Rebele, J. E. (1985). An examination of accounting students' perceptions of the importance of communication skills in public accounting. Issues in Accounting Education, 41-50.

Sterkel, K. S., Johnson, M. I., & Sjogren, D. D. (1986). Textual analysis with computers to improve the writing skills of business communications students. The Journal of Business Communication, 23, 43-61.

Wessell, D. (1986, July 7). Computer software for writers: Helping the bad, hurting the good. The Wall Street Journal, sec. 15, p. 4.