It is spelled out, right there in the code of ethics of the Association for Computing Machinery, one of the world’s oldest and largest computer professional societies: “An essential aim of computer professionals is to minimize negative consequences of computer systems, including threats to health and safety.”

Computer professionals, the code continues, should warn their bosses about any such threats and even blow the whistle if their superiors ignore them.

So why didn’t all those computer professionals do the ethical thing and prevent or blow the whistle on the year 2000 computer problem – a glitch that some experts warn could cause such threats to health and safety as malfunctioning medical equipment and wayward airplanes when the clock ticks over to Jan. 1, 2000?

Just how much programmers and their managers should be faulted for this problem – which, after all, was decades in the making – is but one question being raised about a profession that usually labors in obscurity but plays an increasingly vital role in society.

Some experts say the year 2000 problem could lead to changes in the way programs are written, just as earthquakes can lead to stricter building codes. Moreover, Y2K, as the problem is known, has given new life to a debate about whether programmers should be required to obtain professional certification, similar to that required of doctors, lawyers and certain engineers.

“You don’t want an unlicensed engineer working on a bridge, but you have unlicensed computer programmers working all the time,” said Dr. Marsha C. Woodbury, chairwoman of Computer Professionals for Social Responsibility, an 1,800-member group that deals with social consequences of computing.

Any pressure for change could evaporate, of course, depending on the extent of computing problems that occur after the stroke of midnight on Dec. 31. That is when millions of computers could fail to recognize that the two-digit year date “00” means 2000 and not 1900. But if colossal disasters ensue, many experts say that the public may demand that the Government step in, perhaps by regulating programmers.
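The mechanics of the glitch are simple enough to sketch. What follows is a hypothetical illustration in Python, a sketch rather than code from any system described in this article, of how arithmetic on two-digit year fields breaks at the rollover:

    def full_year(yy):
        # Interpret a two-digit year the way much legacy code did:
        # by prepending a hard-coded 20th-century prefix.
        return 1900 + yy  # "00" becomes 1900, not 2000

    def years_elapsed(start_yy, current_yy):
        # Elapsed time computed from two-digit year fields.
        return full_year(current_yy) - full_year(start_yy)

    # A record created in 1998, examined on Jan. 1, 2000:
    print(years_elapsed(98, 0))  # prints -98 instead of 2

Any billing cycle, expiration check or maintenance schedule built on such arithmetic concludes that time has run backward.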

Even now, well before New Year’s, many computing professionals see the close of an era. “This is the end of 50 years of infatuation and unconstrained trust in what information technologists can deliver,” said Paul A. Strassmann, a former chief information officer for the Defense Department and the Xerox Corporation.

As Mr. Strassmann notes, the Government has already stepped in, as with the Securities and Exchange Commission’s requirement, which took effect last fall, that companies file detailed reports on the costs, timetable and other implications of their repair efforts. That was the first such detailed reporting required by the Government for data processing operations. And it is one reason that Mr. Strassmann formed the Software Testing Assurance Corporation, aimed at verifying companies’ year 2000 cleanup efforts.

Even without Government pressure, big companies are finding that the demands of cataloguing all their software and repairing and testing it have forced them to improve their own data processing procedures. Indeed, this could be the silver lining of the year 2000 cloud.

“I do think it helped us all grow up a little,” said William D. Friel, chief information officer at the Prudential Insurance Company of America. Prudential now keeps better track of what programs it owns, has eliminated unnecessary programs and has bolstered procedures for testing software, he said.

For all its high-technology image, developing software remains more art than science, and a fairly unpredictable art at that.

Last year, 46 percent of big corporate software development projects were either late or over budget, and 28 percent failed completely, according to a survey of 7,500 projects by the Standish Group of research advisers in Dennis, Mass.

Though the practitioners may speak of themselves as professionals, computing does not have the educational requirements and licensing that many other professions do. “Many have called themselves software engineers,” said John R. Speed, executive director of the Texas Board of Professional Engineers. “Wrong. They’re the local music dropout who chooses to use that title.”

Texas last year became the first state to offer certification for professional software engineers, although it is not required. Candidates must have accredited degrees and at least 12 years’ experience or unaccredited degrees and at least 16 years’ experience. A certification test is also being developed.

Like other states, Texas has long required that anyone who designs a project that is for the public sector or that affects public safety must be licensed as a professional engineer. Such requirements were put in place after 1937, when an explosion at an elementary school in New London, Tex., killed more than 300 children. The cause was a faulty regulator on a gas steam heater. Nowadays, Mr. Speed said, the function performed by that mechanical device would be handled by software.

Elsewhere, the idea of licensing or certifying computer programmers has long been discussed but never put in place, mainly because programmers are in huge demand and many of the best do not have formal training or education.

In any event, some experts say it is not clear that certification would have prevented the year 2000 problem.

The problem had its genesis in the 1950’s and 1960’s, when using two digits instead of four to represent the year in programming could save precious and expensive computer memory. It was assumed that the programs would all be replaced by the year 2000. If anything, some computer professionals say, the millennium bug is a testament to how well programmers did their jobs, not how poorly.

“The software was so adaptable it lasted all these years, so we didn’t have to retire it,” said Tom DeMarco, a programmer, author and consultant in Maine. “Our undoing was we did it so much better than we thought we would.”
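The trade-off, and the shortcut many repair teams eventually took, can be sketched briefly. Storing "98" instead of "1998" saved two characters in every date field, at a time when computer memory was scarce and expensive; decades later, rather than widen every field to four digits, many shops applied a repair technique known as windowing, which reads two-digit years into a 100-year span straddling the century boundary. A minimal Python sketch, with an illustrative pivot value:

    PIVOT = 50  # illustrative cutoff; real systems chose their own

    def windowed_year(yy):
        # "Windowing": two-digit years below the pivot are read as
        # 20xx, the rest as 19xx, avoiding the cost of rewriting
        # every stored record to hold four-digit years.
        return 2000 + yy if yy < PIVOT else 1900 + yy

    print(windowed_year(98))  # 1998
    print(windowed_year(0))   # 2000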

Certainly, very few programmers feel pangs of conscience in the way that some nuclear physicists rue having brought the world the Bomb. A few programmers who say they did sound the alarm way back contend that they were ignored by corporate management. Younger programmers see the problem as one created by their predecessors, and as no reflection on themselves.

Donald Gotterbarn, director of the Software Engineering Ethics Institute at East Tennessee State University, said the year 2000 problem resulted from “lack of foresight and bad guesses,” not from ethical lapses.

Where he sees mere mistakes, though, other experts say the problem was a design flaw that could, and should, have been prevented – especially since the two-digit practice continued into the 1990’s, long after computer memory had become cheap and plentiful.

“I would hope we see it for what it is – we’ve made a giant quality mistake here,” said Leon A. Kappelman, an associate professor of business computer information systems at the University of North Texas and co-chairman of the Society for Information Management’s Year 2000 working group.

“It would seem our code of ethics would tell us we shouldn’t have done that,” Dr. Kappelman added, “and once we did it we should have been more proactive.” But the code of ethics of the Association for Computing Machinery, which has 80,000 members in more than 100 countries, has no force behind it. And most programmers probably are not even aware it exists.

As Dr. Woodbury of Computer Professionals for Social Responsibility put it: “Although computer science departments are supposed to teach ethics along with programming, it’s extremely unpopular. People like me find ourselves isolated and even laughed at.”

That could change. The Association for Computing Machinery, in league with the 100,000-member Computer Society of the Institute of Electrical and Electronics Engineers, adopted a new software engineering ethics code late last year. The code calls for programmers to act in the public interest and to “promote an ethical approach to the practice of the profession,” among other measures.

Just who is responsible for the year 2000 glitch could become a legal as well as an ethical question if, as expected, computer problems lead to widespread lawsuits. A wide swath of corporate America, including the computer industry, is backing legislation in Congress that would limit such lawsuits and liability. Last week the Senate at least temporarily shelved the bill, which had become entangled by a number of other, partisan disputes.

But the bill itself has many critics, who see it as an industry attempt to escape responsibility for poor product design. “It’s get-out-of-jail-free cards,” Dr. Kappelman said.

Within corporations, the year 2000 problem has placed new scrutiny on chief information officers. Some have been criticized for not preventing the costly problem. But top management’s greater attention to computer operations can only be good in the long run, some say.

“Technology was just something that happened behind closed doors and now those doors are opened more,” said Lauris Nance, chief information officer of the Public Service Company of North Carolina, a gas utility. “There’s a general better appreciation of the technical staff.” Ms. Nance herself has received several promotions for seizing upon the problem early.

Mr. Strassmann, the former Pentagon technology official, predicts that corporate chief information officers may one day be given fiduciary responsibility that makes them legally accountable for computing operations.

Increased pressure on the information chiefs is leading to change within corporate computing departments. When first confronted a few years ago with the need to check and fix all their programs, many computer departments were not even sure what and how many programs their machines were running. Even when they were identified, many programs lacked documentation – the textual commentaries that are supposed to let a subsequent programmer follow what the author of the software was trying to accomplish.

“Before, you had a lot of people fight you about documentation and stuff,” Ms. Nance said. Now, she said, programmers meet more often as teams, and the company uses outside auditors to help assess the risk of new technology projects.

Whatever changes the year 2000 problem has meant for companies that use and maintain software, it has placed new performance pressures on companies that create software. A question in some pending lawsuits is whether vendors are responsible for fixing software they sold that is not ready for the year 2000. The need for lawyers to bone up on software issues has effectively created a new bar that will be in a position to handle software liability suits, even those unrelated to the year 2000 problem.

“Future contracts are going to have much tighter provisions about vendor responsibility for unforeseen events,” said Jim Jones, manager of the year 2000 program at the Information Management Forum, an organization of information technology executives from large companies and government agencies.

Meanwhile, the ethics debate has thrust many programmers, long stereotyped as antisocial nerds, into an uncharacteristically visible role.

“I don’t know if we’re going to be the heroes or the goats of 1999,” Allan E. Alter, a columnist, wrote in Computerworld, a newspaper for information technology professionals. But, he added, “we must accept our inevitable public role; what the world wants from us is level-headed, trustworthy guidance in a scary time.”