Forum Unified Education Technology Suite
Part 4: Implementing Your Technology


Implementing a Solution

There is no "canned" plan for implementation that will apply to all schools, school districts, and state education agencies. The specifics of what must be done, and what constitutes a reasonable schedule for doing so, depend to a great extent on the choices that have been made with regard to the hardware, software, instructional applications, custom products, and network linkages selected. The schedule will reflect circumstances particular to the organization and the project.

Though implementation is just one phase in the overall process of putting a solution in place, it is wise to treat it as a self-contained project of its own, just as was recommended for the Needs Assessment. Once viewed this way, normal project management rules and practices can be applied.

Assembling an Implementation Team

A great deal may have been accomplished to this point through pure "guerrilla action," in which a small band of committed partisans carries the load in the face of external resistance. Now, however, is the time to seek official recognition and visibility, if for no other reason than that the commitment of funds and the amount of work involved in successfully implementing a technology initiative require oversight by senior decision-makers in the organization.

Selecting an Implementation Project Manager

The Implementation Project Manager (IPM) is the key player to whom everyone else involved with the project will look for direction. This person needs to have enough authority to direct the team and make day-to-day decisions. The importance of selecting the right person for this job cannot be overstated. The preferred candidate is someone who has proven leadership skills and a track record of making good things happen. Ideally it will be someone who has successfully managed similar projects in the past or has been a member of comparable project teams. If no one suitable is available within the organization, it may be worthwhile to look to other sources for help, including external contractors.

Technology doesn’t implement itself—people implement technology.

Establishing a Project Team

The IPM oversees the efforts of the project team, which consists of people who are focused on the success of the project. The project team must have enough staff to complete the tasks at hand, but shouldn't be larger than necessary to get the job done. The IPM should keep in mind that "more" is not always better, and that throwing extra people at a project can often lengthen, rather than shorten, the process. If the project team is assembled specifically for the implementation effort by borrowing staff from other parts of the organization, agreements must be established about the percentage of each member's time the project will demand. It is also important to give the project team the resources (money, time, equipment, and authority) it needs to get the job done.

The project team needs a strong leader, the IPM, as well as all the resources needed to accomplish the task: money, time, equipment, and authority.

Appointing a Steering Committee

In the spirit of maintaining checks and balances, consider organizing a project steering committee. This group should meet periodically to evaluate the work of the IPM and the planning team by reviewing progress and addressing any outstanding issues. Members might include:

  • users who will eventually have to accept the solution you have selected
  • a technical authority from your organization
  • knowledgeable outside advisors
  • the implementation project manager (but not as chair)

Developing a Project Implementation Plan

A thorough and realistic project plan is critical to making the team’s effort efficient and effective. It should focus primarily on tangible tasks at hand—what specifically needs to be done, where, when, and by whom. As the project progresses, the plan should also reflect what has been completed.

Establish a realistic schedule for each phase of the process—what will be done, where, by whom, and when.

Using Project Management Software

Any project that lasts longer than about two months or has more than 8-10 component tasks will probably be made easier through the use of Project Management (PM) software. PM software such as Microsoft Project™, Timeline™, or SureTrak Project Manager™ can be run on standard desktop computers. These software packages offer similar basic tools to help manage projects, including integrated calendars, report generators, scheduling, charting, tracking, and prioritizing. Choose the package with the interface (look and feel) that the IPM prefers, and one that will function on the computer that will be used for project management. The initial effort required to enter project data into PM software generally pays great dividends as the work unfolds. If project team and steering committee members are connected via a network, PM software also makes it easier for them to view, comment on, and participate in the project online.
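The scheduling logic at the heart of such packages can be sketched in a few lines of Python. This is a minimal illustration, not any particular product's algorithm; the task names, durations, and dependencies are invented for the example.

```python
from datetime import date, timedelta

# Hypothetical task list: name -> (duration in days, list of prerequisite tasks).
TASKS = {
    "site preparation":    (10, []),
    "install hardware":    (5,  ["site preparation"]),
    "install software":    (3,  ["install hardware"]),
    "integration testing": (7,  ["install software"]),
}

def schedule(tasks, start):
    """Forward-pass scheduling: each task begins when its last prerequisite ends."""
    finish = {}

    def finish_date(name):
        if name not in finish:
            duration, deps = tasks[name]
            begin = max((finish_date(d) for d in deps), default=start)
            finish[name] = begin + timedelta(days=duration)
        return finish[name]

    for name in tasks:
        finish_date(name)
    return finish

if __name__ == "__main__":
    for task, end in schedule(TASKS, date(2024, 9, 2)).items():
        print(f"{task:20s} finishes {end}")
```

Real PM packages add resource leveling, slack calculation, and critical-path analysis on top of this basic forward pass, but the dependency-driven date arithmetic is the core idea.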

Establishing a Schedule

The schedule is an important part of any implementation plan. It tells participants when they should expect to arrive where they are going. A schedule is only effective, however, if its goals and deadlines are realistic. If the goals are unattainable and deadlines are missed, subsequent deadlines lose their credibility. Some people in the computer industry create schedules by estimating the amount of time they think it will take to do a job, and then doubling it. Such a strategy may not apply to every situation, but it underscores the point that selecting and implementing technology can be a very complicated process (see Figure 4.1).

If the organization has hired an outside consultant to manage implementation, establishing a schedule should be a part of the contract. If the organization is using internal staff, these team members should be involved in developing the project schedule. The schedule should cover what will be done, where it will be done, who will do it, and when it will be done for each phase in the implementation process. Any payment to outside consultants or contractors should be based on the submission of specific deliverable items according to the agreed-upon schedule. It might also be wise to include a "liquidated damages" clause in all contracts with outside organizations, which requires the contractor to pay a penalty whenever work performed fails to meet contracted obligations. Keep in mind, however, that contractors often reduce their risk from liquidated damages by submitting higher initial bids.

Warning Signs When Scheduling!

When scheduling an implementation, be sure to watch out for:

  • projected deadline dates being overruled for "political" reasons, especially in the absence of additional resources
  • schedules that assume early implementation will be as smooth as later implementation
  • "all or nothing" implementation strategies—in other words, large projects without a phasing-in process
  • outside pressures that result in unrealistic schedules (which are doomed to failure)
  • developers turning over the project to those who are responsible for implementation without having first trained them adequately

Monitoring Implementation Progress

A key role of the Implementation Project Manager is to monitor progress on an ongoing basis. Doing so is best accomplished when team members report to the project manager on a regular cycle (e.g., weekly or bi-weekly). The project manager then integrates the information into a periodic status report. Gantt charts produced by project management software are a good vehicle for displaying and updating information on a project’s current status and progress-to-date. An example of a Gantt chart is shown in Table 4.1.
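The essential idea of a Gantt chart, tasks drawn as bars on a shared time axis, can be illustrated with a small text-mode sketch in Python; the task names and day ranges are invented for the example.

```python
# Hypothetical tasks: (name, start day, end day).
tasks = [
    ("site preparation", 0, 10),
    ("install hardware", 10, 15),
    ("train staff",      12, 20),
]

def gantt(tasks, width=20):
    """Render each task as a bar of '#' positioned on a shared time axis."""
    lines = []
    for name, start, end in tasks:
        bar = " " * start + "#" * (end - start)
        lines.append(f"{name:18s}|{bar:<{width}}|")
    return "\n".join(lines)

print(gantt(tasks))
```

Overlapping bars (here, hardware installation and staff training) make concurrent work and potential resource conflicts visible at a glance, which is why the format works well for periodic status reports.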

Handling Schedule Slippage

A common issue in implementation projects is "schedule slippage." Dealing with missed deadlines can be complicated but, as they say, honesty is generally the best policy. Breaking bad news as it arises (and thereby giving people the option of dealing with it) is usually a better plan than waiting to deliver a monumentally bad report all at once.

Verifying that the System Works

Although it may seem like a long way off during the planning stage, at some point the implementation process will be well underway (and even approaching completion). The software will be integrated, the equipment will be installed, and the site will be prepared—perhaps even on schedule. The day will be fast approaching when the new system will be "ready." But how will you know when it is really done? How will you measure its success? To find out whether the job was done correctly, the system must be fully in use, but you want to verify the system's completeness and proper functioning in advance of full user loads.

Proper system testing is a three-step process:

  1. Each component must be tested individually.
  2. The entire system (i.e., all of the components) should be tested to ensure that the pieces work together.
  3. The system should be subjected to "live testing" that simulates real usage, with a workload, distribution of users, and processing volume similar to those of a typical day.
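As a minimal illustration of the three levels, consider a hypothetical student-record store written in Python; the class, functions, and data are invented for the example.

```python
# Hypothetical student-record store used to illustrate the three testing levels.
class RecordStore:
    def __init__(self):
        self._records = {}

    def add(self, student_id, name):
        self._records[student_id] = name

    def lookup(self, student_id):
        return self._records.get(student_id)

def report(store, ids):
    """A second 'component' that depends on the store."""
    return [store.lookup(i) for i in ids if store.lookup(i) is not None]

# 1. Component test: each piece in isolation.
s = RecordStore()
s.add(1, "Ada")
assert s.lookup(1) == "Ada"

# 2. Integration test: the pieces working together.
assert report(s, [1, 2]) == ["Ada"]

# 3. Live test: simulate a typical day's volume of records.
for i in range(10_000):
    s.add(i, f"student-{i}")
assert all(s.lookup(i) is not None for i in range(10_000))
```

Each level catches a different class of failure: a broken component, components that disagree about their interface, and a system that works in the small but degrades under realistic volume.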

Testing Hardware and Software

Whether the project is a custom development effort or the implementation of an off-the-shelf package, each product must be tested as it is brought online. Technical team members who are developing, integrating, or customizing the system must incorporate hardware and software testing into their routine work to verify that each product does what it was designed to do—ideally by referencing pre-approved objective specifications. Specifications might originate in the Functional Specifications document prepared during project planning, product documentation supplied by a vendor, or industry-accepted standards available in published reports.

Testing Integration

The overall functioning of the system requires that the set of components and software applications work together. The only way to verify this is to enter information into one part of the system and check to see that it is dealt with properly and transmitted throughout the other parts of the system (see Figure 4.2). In a simple software package that provides for storage and retrieval of student information, this means that data can be entered and then tracked to ensure that access is provided (and limited) as needed to generate desired reports. In transaction processing systems (such as financial packages), it means that each type of transaction that the system is supposed to handle is shown to work. With instructional applications, it means that all computers used by teachers and students should be inspected to verify that software can be accessed as planned. Oftentimes integration testing must be done repeatedly in order to verify appropriate performance.

Things to avoid during testing:

  • Insufficient time planned between system testing and implementation
  • Untested manual procedures
  • Untested interfaces

Testing Performance

Even if the system appears to be working, and integration tests confirm as much, the system must prove itself under actual working conditions. That is, does it hold up when accessed by its routine number of users to perform its routine volume of transactions? To realistically test system performance, it may be necessary to enlist volunteers to "bang away" at the system, thereby simulating "normal" usage levels. Be careful, however, to rely upon actual performance measures that indicate success rather than more subjective measures of user approval (e.g., "It really looked good to me!").

Don't confuse user approval with system functionality.
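To make "banging away" concrete, here is a minimal Python sketch that simulates concurrent users and reports an objective measure (transactions completed and elapsed time) rather than an impression. The transaction itself is a hypothetical stand-in for whatever operation the real system performs.

```python
import threading
import time

# Hypothetical operation standing in for one user's transaction.
def run_transactions(counter, lock, n=100):
    for _ in range(n):
        with lock:
            counter[0] += 1  # stand-in for real work (query, update, etc.)

def load_test(num_users=20, per_user=100):
    """Simulate num_users working simultaneously; return (count, seconds)."""
    counter, lock = [0], threading.Lock()
    threads = [
        threading.Thread(target=run_transactions, args=(counter, lock, per_user))
        for _ in range(num_users)
    ]
    start = time.perf_counter()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    elapsed = time.perf_counter() - start
    return counter[0], elapsed

if __name__ == "__main__":
    done, secs = load_test()
    print(f"{done} transactions in {secs:.3f}s")
```

Measured throughput and elapsed time give the team a number to compare against the expected daily volume, which is exactly the kind of objective evidence the previous paragraph calls for.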

Testing the Software Interface

If the new system involves interfaces with external systems, the importance of verifying that the new interface works properly cannot be overstated. Unfortunately, sometimes a new system (even with precisely the same specifications as the old system) needs to be modified in order to establish contact with external software applications that had worked just fine with the old system. It’s hard to explain, but true. Changes to existing applications should be identified, specified, and incorporated into the implementation process for all new applications.

Converting from Old Information Systems

Conversion is the task of moving information from an existing computer system (or from paper files) to a new system. Conversions can open the doors to welcome changes—out with the old and in with the new! But the process of transition can be painful. Sometimes it helps to ease the pain if the transition is made gradually—i.e., maintaining the integrity of the old system while simultaneously running a parallel new system. In the case of technology, a conversion of data systems is an opportunity to dispose of unneeded files and records (as long as laws related to maintenance of records are followed) and to establish new, streamlined, and efficient systems.

Conversions are most successful when plans have been made to automate the conversion, test translators more than once, and operate the old and new systems in parallel until the transition is complete.
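A parallel run can be sketched in a few lines of Python: feed the same records through both systems and flag any disagreements. The two fee calculations below are invented stand-ins for an old and a new system.

```python
# Invented stand-in for the legacy calculation.
def old_system(record):
    return record["fee"] // 3            # legacy: integer division (truncates)

# Invented stand-in for the replacement calculation.
def new_system(record):
    return round(record["fee"] / 3)      # replacement: rounds instead

def parallel_run(records):
    """Run both systems over the same records; return those that disagree."""
    return [r for r in records if old_system(r) != new_system(r)]

records = [{"fee": f} for f in range(10)]
mismatches = parallel_run(records)
print("disagreements on fees:", [r["fee"] for r in mismatches])
```

Here the parallel run surfaces a subtle behavioral difference (truncation versus rounding) that neither system would reveal on its own; in a real conversion, each flagged record would be investigated before the old system is retired.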

Knowing the Process for Conversion

The conversion process must be well planned and implemented if costly delays and loss of productivity are to be avoided. This process is the joint responsibility of developers and users. The developer is responsible for establishing technical capabilities. Users are responsible for evaluating whether the new technologies are actually working. The results of these efforts must be documented and verified.

Avoiding Problems in a Conversion

Organizations typically underestimate the time and resources required for a smooth conversion from old systems to new. Moreover, conversions sometimes fail. A well-thought-out fallback plan prevents serious business interruptions. However, fallback plans are not always practical due to data synchronization problems. Be aware of the following planning errors that could trigger major setbacks to your implementation effort:

  • extensive manual record conversion
  • insufficient testing
  • insufficient controls or audit trails
  • absence of a fallback plan

Forewarned is forearmed! See Figure 4.3 for additional suggestions to help with system conversion.

Implementing the Changeover of Information Systems

Once the information has been converted, the new system may be physically ready for use. (User training is, of course, another issue.) However, transition probably involves much more than simply "flipping the switch." The changeover process still requires careful management. It's generally a good idea to run the two systems (old and new) in parallel until staff members are satisfied that the new one performs as envisioned. While this will involve extra work for both users and technical support staff, the risk reduction and peace of mind it provides are almost always worth the trouble. After all, doing so pretty much ensures that there will not be an interruption in the service and functions being supplied.

Arranging for System Handover

The final step in the implementation process is the handover—the point in time at which the organization deems that the technology system is complete and ready for routine usage. Be sure to verify that all components and all user groups are functioning as planned.

Handover can be an exciting and rewarding, albeit nerve-racking, milestone. It is the culmination of a great deal of work and worry. It is also critical from a contractual viewpoint. If a system is based on a commercial vendor's product, this is the date at which the warranty period commences. If a contractor was hired to perform custom development, the warranty period for their code also begins at handover (during which they will correct errors at their expense).


National Center for Education Statistics - http://nces.ed.gov
U.S. Department of Education