

8th NCTT Annual Curriculum Workshop
July 11 - 14, 2005, Springfield, MA




Challenging Projects and Virtual Labs in Web-enhanced IT Classes

Vladimir Riabov

Associate Professor

Department of Mathematics & Computer Science

Rivier College, USA

E-mail: vriabov@rivier.edu

Agenda:

  1. Web-enhanced IT Classes at Rivier College;

  2. Lecture Notes and Web Resources;

  3. Class Assignments;

  4. Virtual OPNET Labs;

  5. Examples of Students’ Projects and Research:

     - Project Papers (SANs, WiFi, Gigabit Ethernet, etc.)

     - Digital Video Cluster Simulation with OMNeT++

     - Code Complexity Analysis for Two Projects in Networking

  6. Conclusion




Web-enhanced IT Classes at Rivier College

  1. Undergraduate and Graduate Programs in CS & CIS

http://www.rivier.edu/departments/mathcs/home/cs/CSIndex.htm

  2. Certificates in Networking and Information Technologies

  3. IT-related Courses:

     - CS553: Introduction to Networking Technology

     - CS572: Computer Security

     - CS573: Advanced Wide Area Networks

     - CS575: Advanced Local Area Networks

     - CS597: Multimedia and Web Development

     - CS612: Information Technology

     - CS632: Client/Server Computing

     - CS685: Network Management, and others

  4. Web-enhanced Classes across CS/CIS Curricula

Lecture Notes and Web Resources

  1. Instructor’s Web site (Teaching, Research & Publications):

http://www.rivier.edu/faculty/vriabov/

  2. Web sites for IT Courses:

     - Syllabi

     - Lecture Notes

     - Assignments

     - Schedules

     - Resources

     - Examples of Students’ Project Papers

     - Web Resources

Class Assignments

  1. Warm-up Exercises

  2. Homework Assignments

  3. Virtual Labs

  4. Midterm Exams

  5. Project Papers

  6. Research Reports

  7. Final Exams

Warm-up Exercises (examples)

  1. What is the last digit of the number 2597^5927 [mod 10]?

  2. Using an MS Excel™ spreadsheet, find the last digit of the number 2^53 [mod 10].

  3. How can you use your findings in these two cases for encrypting e-messages?

Last digit of the number 2597^5927 [mod 10]?

  1. It is enough to consider the last digit of the simpler number 7^5927;

  2. Do your experiments (see Table)!

  3. The “LAST” digit can be 7, 9, 3, or 1 only; therefore, it cycles through four cases;

  4. The power 5927 can be represented as 5927 = 4*1481 + 3;

  5. Therefore, the “LAST” of 7^5927 is the same as the “LAST” of 7^3, which is “3”.

  6. Answer: “3”.

Last digit of the number 2^53 [mod 10]?

  1. The “LAST” digit can be 2, 4, 8, or 6 only; therefore, it cycles through four cases;

  2. The power 53 can be represented as 53 = 4*13 + 1;

  3. The “LAST” of 2^53 is the same as the “LAST” of 2^1, which is “2”.

  4. Therefore, following this algorithm, the last digit of the number 2^53 must be 2;

  5. Try an MS Excel™ spreadsheet (see Table)!

  6. Why is the last digit of 2^53 shown as 0 there?

  7. HINT: Consider the number of significant (“valuable”) digits in large natural numbers calculated with MS Excel™!




Homework Assignments (example 1)

  1. Using Manchester Encoding Format, encode a bit-stream that represents two first letters of your last name previously written in the ASCII (7-bit) Coding Standard. Using MS Word, plot a diagram that illustrates your Manchester code.

Homework Assignments (example 2)

  1. Plot a diagram that illustrates a virtual private connection from your home computer to the Rivier College Network. Briefly describe issues that should be resolved for establishing this connection.

Homework Assignments (example 3)

A LAN has a data rate of r = 4 Mbps and a propagation delay between two stations at opposite ends of d = 20 µs. For what range of PDU sizes (S, measured in bits) does stop-and-wait flow control give an efficiency of at least 50%, E ≥ 0.5? (Neglect the transmission time of the ACK signal.) The efficiency E is defined as the ratio of the PDU transmission time (the time to insert the PDU onto the medium) to the total time the medium is occupied by this one PDU.

Visiting the IT Services Department

Virtual OPNET Labs

  1. OPNET IT Guru Academic Edition™:

http://enterprise37.opnet.com/4dcgi/COMMUNITY_HOME

http://www.opnet.com/services/university/home.html

  2. OPNET Virtual Lab Manuals:

http://www.opnet.com/services/university/lab_manuals.html

     - William Stallings, Business Data Communications, Fifth Edition

     - William Stallings, Data and Computer Communications, Seventh Edition

     - Raymond R. Panko, Business Data Networks and Telecommunications, Fourth Edition

     - Larry L. Peterson and Bruce S. Davie, Computer Networks: A Systems Approach, Third Edition.

Available Virtual Labs

  1. Four to six OPNET™ Virtual Labs per Course:

http://www1.us.elsevierhealth.com:8300/MKP/Aboelela/manual/index.html

L00: Introduction - Basics of OPNET IT Guru Academic Edition™

L01: Ethernet - A Direct Link Network with Media Access Control

L02: Token Ring - A Shared-Media Network with Media Access Control

L03: Switched LANs - A Set of Local Area Networks Interconnected by Switches

L04: Network Design - Planning a Network with Different Users, Hosts, and Services

L05: ATM - A Connection-Oriented, Cell-Switching Technology

L06: RIP: Routing Information Protocol - A Routing Protocol Based on the Distance-Vector Algorithm

L07: OSPF: Open Shortest Path First - A Routing Protocol Based on the Link-State Algorithm

L08: TCP: Transmission Control Protocol - A Reliable, Connection-Oriented, Byte-Stream Service

L09: Queuing Disciplines - Order of Packet Transmission and Dropping

L10: RSVP: Resource Reservation Protocol - Providing QoS by Reserving Resources in the Network

L11: Firewalls and VPN - Network Security and Virtual Private Networks

L12: Applications - Network Application Performance Analysis

Virtual Lab Basics

Lab Project Editor Window

Lab: Network Expansion Plan

Creating a New Scenario

Creating the Network

Creating the Network (Step 2)

Students’ Project Papers: “Storage Area Networks (SANs)”

Students’ Project Papers on Selected Networking Protocols

Students’ Project Papers: “WiFi Technologies”

Research Project: Digital Video Cluster Simulation with OMNeT++

Digital Video Cluster Simulation (continued)

Digital Video Cluster Simulation (results)




Research Project: “Networking Software Studies with the Structured Testing Methodology”

McCabe’s Structured Testing Methodology Approach and Tools for Networking Software Development

  1. McCabe’s Structured Testing Methodology:

     - a unique methodology for software testing proposed by McCabe in 1976;

     - approved as a NIST standard for structured testing (1996);

     - a leading tool in the computer, military, and aerospace industries (HP, GTE, AT&T, Alcatel, GIG, Boeing, NASA, etc.) since 1977;

     - provides code-coverage capability.

  2. Author’s Experience with McCabe IQ Tools since 1998:

     - led three projects in the networking industry that required code analysis, code coverage, and test coverage;

     - completed BCN code analysis with McCabe tools;

     - completed BSN code analysis with McCabe tools;

     - studied BSN-OSPF code coverage & test coverage;

     - has included these topics in Software Engineering and Networking classes since 1999.

McCabe’s Structured Testing Methodology Basics

  1. The key requirement of structured testing is that all decision outcomes must be exercised independently during testing.

  2. The number of tests required for a software module is equal to the cyclomatic complexity of that module.

  3. Software complexity is measured by metrics:

     - cyclomatic complexity, v

     - essential complexity, ev

     - module design complexity, iv

     - system design complexity, S0, and system integration complexity, S1

     - Halstead metrics, and 52 more metrics.

  4. The testing methodology makes it possible to identify unreliable and unmaintainable code, predict the number of code errors and the maintenance effort, and develop strategies for unit/module testing, integration testing, and test/code coverage.

Basics: Analyzing a Module

  1. For each module (a function or subroutine with a single entry point and a single exit point), an annotated source listing and a flowgraph are generated.

  2. A flowgraph is an architectural diagram of a software module’s logic.

Flowgraph Notation (in C)

Flowgraph and Its Annotated Source Listing

Would you buy a used car from this software?

  1. Problem: there are size and complexity boundaries beyond which software becomes hopeless:

     - Too error-prone to use

     - Too complex to fix

     - Too large to redevelop

  2. Solution: control complexity during development and maintenance

     - Stay away from the boundaries.

Important Complexity Measures

  1. Cyclomatic complexity: v = e - n + 2 (e = edges; n = nodes)

     - Amount of decision logic

  2. Essential complexity: ev

     - Amount of poorly structured logic

  3. Module design complexity: iv

     - Amount of logic involved with subroutine calls

  4. System design complexity: S0 = Σiv (summed over all modules)

     - Number of independent unit (module) tests for a system

  5. System integration complexity: S1 = S0 - N + 1

     - Number of integration tests for a system of N modules.

Cyclomatic Complexity

  1. Cyclomatic complexity, v - A measure of the decision logic of a software module.

  2. Applies to decision logic embedded within written code.

  3. Is derived from predicates in decision logic.

  4. Is calculated for each module in the Battlemap.

  5. Grows from 1 to a high but finite number, depending on the amount of decision logic.

  6. Is correlated with software quality and testing effort; units with higher v (v > 10) are less reliable and require more testing.

Essential Complexity - Unstructured Logic

Essential Complexity, ev

  1. Flowgraph and reduced flowgraph after structured constructs have been removed, revealing decisions that are unstructured.

  2. Essential complexity helps detect unstructured code.

Module Design Complexity, iv

Module Metrics Report

Low Complexity Software

  1. Reliable

  2. Simple logic

  3. Low cyclomatic complexity

  4. Not error-prone

  5. Easy to test

  6. Maintainable

  7. Good structure

  8. Low essential complexity

  9. Easy to understand

  10. Easy to modify

Moderately Complex Software

  1. Unreliable

  2. Complicated logic

  3. High cyclomatic complexity

  4. Error-prone

  5. Hard to test

  6. Maintainable

  7. Can be understood

  8. Can be modified

  9. Can be improved

Highly Complex Software

  1. Unreliable

  2. Error prone

  3. Very hard to test

  4. Unmaintainable

  5. Poor structure

  6. High essential complexity

  7. Hard to understand

  8. Hard to modify

  9. Hard to improve

McCabe QA

McCabe QA measures software quality with industry-standard metrics

  1. Manage technical risk factors as software is developed and changed

  2. Improve software quality using detailed reports and visualization

  3. Shorten the time between releases

  4. Develop contingency plans to address unavoidable risks

Processing with McCabe Tools

Project B: Backbone™ Concentration Node

  1. This system has been designed to support carrier networks. It provides both the services of conventional Layer 2 switches and the routing and control services of Layer 3 devices.

  2. Nine protocol-based subtrees of the code (3400 modules written in the C programming language for BGP, DVMRP, Frame Relay, ISIS, IP, MOSPF, OSPF2, PIM, and PPP protocols) were analyzed.

Project-B Protocol-Based Code Analysis

  1. Unreliable modules: 38% of the code modules have cyclomatic complexity greater than 10 (including 592 functions with v > 20);

  2. Only two code parts (FR, ISIS) are reliable;

  3. BGP and PIM have the worst characteristics (49% of their code modules have v > 10);

  4. 1147 modules (34%) are unreliable and unmaintainable, with v > 10 and ev > 4;

  5. BGP, DVMRP, and MOSPF are the most unreliable and unmaintainable (42% of modules);

  6. Project-B was cancelled.

Project-B Protocol-Based Code Analysis (continued)

  1. 1066 functions (31%) have module design complexity greater than 5. The system integration complexity is 16026, an upper estimate of the number of integration tests;

  2. Only the FR, ISIS, IP, and PPP modules require about 4 integration tests per module. BGP, MOSPF, and PIM have the worst characteristics (42% of their code modules require more than 7 integration tests per module);

  3. The B-2.0.0.0int18 release potentially contains 2920 errors, as estimated by the Halstead approach. FR, ISIS, and IP have relatively low B (error) metrics, significantly below the average level of 0.86 errors per module. For BGP, DVMRP, MOSPF, and PIM, the error level is the highest (more than one error per module).

Comparing Project-B Core Code Releases

  1. NEW B-1.3 Release (262 modules) vs. OLD B-1.2 Release (271 modules);

  2. 16 modules were deleted (7 with v > 10);

  3. 7 new modules were added (all with v < 10, ev = 1);

  4. Sixty percent of the changes were made in code modules with cyclomatic complexity greater than 20;

  5. 63 modules are still unreliable and unmaintainable;

  6. 39 out of 70 (56%) modules with v > 10 were targeted for changes and remained unreliable;

  7. 7 out of 12 (58%) modules increased to v > 10;

  8. Significant reductions were achieved in the System Design (S0) and System Integration (S1) metrics:

S0 from 1396 to 1294; S1 from 1126 to 1033.

  9. The new release potentially contains 187 errors (vs. 206 errors), as estimated by the Halstead approach.

  10. Nevertheless, Project-B was cancelled.

Project C: Broadband Service Node

  1. Broadband Service Node (BSN) allows service providers to aggregate tens of thousands of subscribers onto one platform and apply customized IP services to these subscribers;

  2. Different networking services [IP-VPNs, firewalls, Network Address Translation (NAT), IP Quality-of-Service (QoS), Web steering, and others] are provided.

Project-C Code Subtrees-Based Analysis

  1. THREE branches of the Project-C code (Release 2.5int21) have been analyzed, namely the RMC, CT3, and PSP subtrees (23,136 modules);

  2. 26% of the code modules have cyclomatic complexity greater than 10 (including 2,634 functions with v > 20) - unreliable modules!

  3. All three code parts are at approximately the same level of complexity (averages per module: v = 9.9; ev = 3.89; iv = 5.53);

  4. 1.167 million lines of code have been studied (50 lines per module on average);

  5. 3,852 modules (17%) are unreliable and unmaintainable, with v > 10 and ev > 4;

  6. The estimated number of possible ERRORS is 11,460;

  7. 128,013 unit tests and 104,880 module integration tests should be developed to cover all modules of the Project-C code.

Project-C Protocol-Based Code Analysis

  1. NINE protocol-based areas of the code (2,141 modules) have been analyzed, namely BGP, FR, IGMP, IP, ISIS, OSPF, PPP, RIP, and SNMP;

  2. 130,000 lines of code have been studied;

  3. 28% of the code modules have cyclomatic complexity greater than 10 (including 272 functions with v > 20) - unreliable modules!

  4. The FR & SNMP parts are well designed & programmed, with few possible errors;

  5. 39% of the BGP and PPP code areas are unreliable (v > 10);

  6. 416 modules (19.4%) are unreliable & unmaintainable (v > 10 & ev > 4);

  7. 27.4% of the BGP and IP code areas are unreliable & unmaintainable;

  8. The estimated number of possible ERRORS is 1,272;

  9. 12,693 unit tests and 10,561 module integration tests should be developed to cover the NINE protocol-based areas of the Project-C code;

  10. The decision was made to redesign the Project-C software and develop a new system prototype.

Correlation between the Number of Error Submits, the Number of Unreliable Functions (v > 10), and the Number of Possible Errors for Six Protocols

Correlation between the Number of Customer Reports, the Number of Unreliable Functions (v > 10), and the Number of Possible Errors for Five Protocols

What the Structured Testing Methodology has done for us:

  1. Identified complex code areas (high v).

  2. Identified unreliable & unmaintainable code (v >10 & ev >4).

  3. Predicted the number of code errors and the maintenance effort [Halstead B-, E-, and T-metrics].

  4. Estimated manpower to develop, test, and maintain the code.

  5. Developed strategies for unit/module testing, integration testing.

  6. Provided Test & Code Coverage [paths vs. lines].

  7. Identified “dead” code areas.

  8. Improved Software Design and Coding Standards.

  9. Improved Reengineering Efforts in many other projects.

  10. Validated Automated Test Effectiveness.

Conclusions

  1. Web-enhanced classes in networking technologies and other related areas provide students with better instructional support than “traditional” classes;

  2. Warm-up in-class exercises, homework assignments, lecture notes, field trips to IT Services, and virtual labs help students become familiar with modern, state-of-the-art networking technologies;

  3. Students’ challenging projects and research become vital components of their active studies in college, helping them find jobs and advance in networking companies;

  4. Instructional openness and support become a powerful resource for students in the classroom and in their future professional lives.





