Article Id: WHEBN0010780500
Title: Programming productivity
Author: World Heritage Encyclopedia
Language: English
Subject: Software diagnosis, SAC programming language, Complex instruction set computing, Software project management, Assembly language
Publisher: World Heritage Encyclopedia
Programming productivity

Programming productivity refers to the software development issues and methodologies that affect the quantity and quality of code produced by an individual or team. Key topics in productivity discussions have included how productivity should be defined and measured, and which tools and practices improve it.

The relative importance of programming productivity has waxed and waned along with other industry factors, such as:

  • The relative costs of manpower versus machines
  • The availability of a substantially less expensive global workforce via the Internet (see, for example, the Wired articles http://www.wired.com/wired/archive/12.02/india.html and http://www.wired.com/wired/archive/12.02/india.html?pg=7)
  • The size and complexity of the systems being built
  • Highly publicized projects that suffered from delays or quality problems
  • Development of new technologies and methods intended to address productivity issues
  • Quality management techniques and standards
  • Apathy (productivity must be treated as an explicit goal before it can improve)

A generally accepted working definition of programmer productivity, along with appropriate metrics, has yet to be established. Productivity also needs to be viewed over the lifetime of the code. For example, if programmer A writes code in a shorter interval than programmer B, but A's code is of lower quality and months later requires additional effort to match the quality of B's code, it is fair to claim that programmer B was actually the more productive of the two.
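The lifetime view above can be made concrete with a toy calculation. This is an illustrative sketch, not an established metric: the function, the hour figures, and the two-programmer scenario are all hypothetical, chosen only to show why counting rework reverses the apparent ranking.

```python
# Illustrative only: a toy "lifetime effort" model for comparing two
# programmers who end up delivering code of the same final quality.
# lifetime_hours() and all numbers below are hypothetical assumptions.

def lifetime_hours(initial_hours: float, rework_hours: float) -> float:
    """Total effort over the code's lifetime: initial writing plus later rework."""
    return initial_hours + rework_hours

# Programmer A finishes sooner, but the code needs substantial rework
# months later; programmer B is slower up front and needs none.
a_total = lifetime_hours(initial_hours=20, rework_hours=35)  # 55 hours
b_total = lifetime_hours(initial_hours=30, rework_hours=0)   # 30 hours

# Judged over the lifetime of the code, B was the more productive one.
assert b_total < a_total
```

The point of the sketch is only that a snapshot metric (initial hours) and a lifetime metric (initial plus rework hours) can rank the same two programmers in opposite orders.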

Hardware aspects of programmer productivity

It is unfair to measure programmer productivity without factoring in the software and hardware tools available to the programmers being measured. For example, a programmer with two displays is likely to be more productive than one with a single display. With solid-state drives becoming less expensive, hardware can also be fine-tuned for the fast compile-and-test cycles required by practices such as test-driven development (TDD).
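The TDD workflow mentioned above can be sketched as a single "red-green" iteration; the `slugify` function and its test are hypothetical examples, included only to show the shape of the cycle that fast build/test hardware has to support, since it repeats after every small change:

```python
# A minimal sketch of one test-driven development (TDD) iteration, using a
# hypothetical slugify() helper. In TDD the test is written first and fails
# ("red"); the implementation below is then the smallest code that makes it
# pass ("green"). Fast hardware matters because this loop runs constantly.

def test_spaces_become_hyphens():
    assert slugify("Programming Productivity") == "programming-productivity"

def slugify(title: str) -> str:
    """Written only after the failing test above already existed."""
    return title.strip().lower().replace(" ", "-")

test_spaces_become_hyphens()  # "green": this iteration of the cycle passes
```

In practice the test run is delegated to a test runner rather than called by hand, but the economics are the same: the shorter each red-green cycle, the more of them fit into a working day.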

An extensive literature exists dealing with such issues as software productivity measurement, defect avoidance and removal, and software cost estimation. The heyday of such work was during the 1960s-1980s, when huge mainframe development projects often ran badly behind schedule and over budget. A potpourri of development methodologies and software development tools were promulgated, often championed by independent consultants brought in as troubleshooters on critical projects. The U.S. Department of Defense was responsible for much research and development in this area, as software productivity directly affected large military procurements.
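One widely known product of that era's cost-estimation research is Boehm's COCOMO family of models (the COCOMO II book appears in the references below). As a sketch of the flavor of this work, basic COCOMO estimates effort in person-months from program size in thousands of lines of code (KLOC) with a simple power law; the coefficients below are the published basic-COCOMO values, and the 50 KLOC example project is hypothetical:

```python
# Basic COCOMO effort estimate: effort = a * (KLOC ** b) person-months.
# (a, b) pairs are the published basic-COCOMO coefficients per project class.
COEFFICIENTS = {
    "organic":       (2.4, 1.05),  # small teams, familiar problem domains
    "semi-detached": (3.0, 1.12),  # mixed experience, medium-sized systems
    "embedded":      (3.6, 1.20),  # tight hardware/operational constraints
}

def cocomo_effort(kloc: float, project_class: str = "organic") -> float:
    """Estimated effort in person-months for a program of `kloc` KLOC."""
    a, b = COEFFICIENTS[project_class]
    return a * kloc ** b

# Hypothetical 50 KLOC in-house project:
print(round(cocomo_effort(50, "organic"), 1))  # → 145.9 person-months
```

Later models (intermediate COCOMO, COCOMO II) multiply this baseline by cost drivers for personnel, product, and platform attributes, which is precisely where productivity factors enter the estimate.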

In those days, large development projects were generally clean-sheet implementation of entire systems, often including their own system-level components (such as data management engines and terminal control systems). As a result, large organizations had enormous data processing staffs, with hundreds or thousands of programmers working in assembly language, COBOL, JOVIAL, Ada, or other tools of the day.

Modern computer use relies much more heavily on the use of standardized platforms and products, such as the many general-purpose tools available today under Linux and the Microsoft operating systems. Organizations have more off-the-shelf solutions available, and computer use is a basic job requirement for most professionals. Tasks that once would have required a small development team are now tackled by a college intern using Microsoft Excel. The result has been a trend toward smaller IT staffs and smaller development projects. With larger projects, techniques like rapid prototyping have shortened development project timelines, placing a priority on quick results with iterative refinement. Traditional programming-in-the-large has thus become rare – the domain of industry giants like Microsoft and IBM. As a result, although programming productivity is still considered important, it is viewed more along the lines of engineering best practices and general quality management, rather than as a distinct discipline.

A need for greater programmer productivity has been the impetus for categorical shifts in programming paradigms. Successive paradigms have differed in:

  • Speed of code generation
  • Approach to maintenance
  • Emerging technologies
  • Learning curve (training required)
  • Approach to testing

References

  • Software Cost Estimation with COCOMO II, Barry W. Boehm et al., Prentice Hall, 2000. ISBN 978-0-13-026692-7.
  • Developing Products in Half the Time: New Rules, New Tools, Preston G. Smith and Donald G. Reinertsen, Wiley, 1997. ISBN 978-0-471-29252-4.
  • Programming Productivity, Capers Jones, McGraw-Hill, 1986. ISBN 978-0-07-032811-2.
  • Estimating Software Costs, Capers Jones, McGraw-Hill, 2007. ISBN 978-0-07-148300-1.

Internet articles

  • "Coding Horror: Joining The Prestigious Three Monitor Club" (December 2006) http://www.codinghorror.com/blog/2006/12/joining-the-prestigious-three-monitor-club.html
  • "Coding Horror: The Programmer's Bill of Rights" (August 2006) http://www.codinghorror.com/blog/2006/08/the-programmers-bill-of-rights.html
  • "Dual Monitors", Bob Rankin http://askbobrankin.com/dual_monitors.html

External links

  • Ballmer Peak, a humorous take on programmer productivity vs. blood alcohol content
This article is derived from World Heritage Encyclopedia content, available under the Creative Commons Attribution-ShareAlike License.