Saturday, July 22, 2006

Increase Your Computer’s Heart Beat: Add RAM to It.

Sometimes the computer becomes so sluggish that it is difficult to tolerate, and we start trying all sorts of things, like rebooting or deleting extra folders, but they do not always work.
For a slow computer to work fast we need speed, and that means lots of RAM, which stands for Random Access Memory. It is the kind of memory used for holding the programs and data being executed.
When we start our computer and launch any application, such as a web browser or a letter we are writing, the data required to run that application and the operating system is loaded into RAM, sometimes just referred to as memory.

A new PC bought from the market comes with a standard amount of memory. The RAM that comes with the computer is the minimum amount required to run what’s already installed on the system. More powerful programs and applications consume more RAM, and because most systems multitask, i.e. run more than one application at a time, RAM consumption keeps increasing. Then there are programs that run constantly in the background so that you can access them instantly at any time; these, too, are nonstop RAM consumers.

Random Access Memory chips are large and rectangular, made up of memory cells with support logic that reads and writes data. RAM is mounted directly onto the motherboard and transfers data over address and data buses, each of which consists of a number of circuits.

RAM speed can be measured in different ways depending on the age of the computer. Older RAM is measured in nanoseconds, also written as ns. One nanosecond is one-billionth of a second, and it is the measure of the access time to or from RAM.

Newer RAM is measured in megahertz, matching the speed at which the motherboard communicates with the processor. A higher RAM speed in megahertz means faster reads and writes to the RAM chip.
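As a rough illustration, access time and clock rate are reciprocals, so the two measurements can be converted into one another (assuming one access per clock cycle, which is a simplification of real memory timing):

```python
def ns_to_mhz(access_time_ns):
    """Convert an access time in nanoseconds to the equivalent clock
    frequency in megahertz, assuming one access per cycle."""
    # 1 ns = 10**-9 s, 1 MHz = 10**6 cycles/s, so MHz = 1000 / ns
    return 1000.0 / access_time_ns

# Classic 60 ns DRAM corresponds to roughly 16.7 MHz,
# while 10 ns memory corresponds to 100 MHz.
print(ns_to_mhz(60))
print(ns_to_mhz(10))   # 100.0
```

This also shows why the two units tell the same story: a smaller access time in nanoseconds and a larger number of megahertz both mean faster memory.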

As computer technology changes, we keep adding more software to our systems; more powerful applications and more programs arrive, and each needs its share of RAM. If you have too little RAM in your system, every process has to wait for its turn, which takes time, so your system will be slow and you will be irritated.
To make your system fast you have to upgrade it by adding more RAM. If you have finally decided to upgrade, you first have to find out what kind of RAM your computer uses. You need to match its speed and pin count, and, based on your system’s existing RAM, choose either tin- or gold-plated pins. Once you have the right RAM, take the old module out of its socket and rock the new one into its place.
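As a side note for Linux users, one quick way to check how much memory is already installed is to read /proc/meminfo. The sketch below parses its MemTotal line; the sample text here is hypothetical, and the exact fields vary by kernel version:

```python
def mem_total_kb(meminfo_text):
    """Extract the MemTotal value (in kB) from /proc/meminfo contents."""
    for line in meminfo_text.splitlines():
        if line.startswith("MemTotal:"):
            # The line looks like: "MemTotal:   515748 kB"
            return int(line.split()[1])
    raise ValueError("MemTotal line not found")

sample = "MemTotal:   515748 kB\nMemFree:    123456 kB\n"
print(mem_total_kb(sample))  # 515748
# On a real Linux system: mem_total_kb(open("/proc/meminfo").read())
```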

After upgrading your system you will see a satisfying change in speed: your system will be faster and your work done in a single click. RAM is the lifeblood of the computer’s operations.

Changing Technologies.

If we go back a few years and compare the world then with what it is now, we can see we have come a long way in a short span of time. Let’s go back to the days when the Internet was a new concept and very few people were aware of the triplet WWW, i.e. the World Wide Web; nobody would have thought then that these Ws were going to rule the world.
The Internet has revolutionized computing and communications like nothing before it. It started as ARPANET and grew into the Internet, and it is one of the most successful examples of the benefits of sustained investment and commitment to research and development of information infrastructure.
Because of the Internet we are so near to our parents and relatives, in spite of being so far away geographically. Sometimes we open our hearts and write long mails, and sometimes it’s only a few lines to say we are doing well, but in every mail there is hope, a sign of life, a contact.

Technology is changing very fast: step out of this technological world for about 90 days and you will find yourself far behind all the developments.
Let’s take a look at the growth of computing hardware. Initially the computer was a room-sized box fed with punch cards; gradually its size decreased, and now it has become so light and small that it can sit on our lap as a laptop, notebook or tablet PC. The time is not far off when technological advances will allow computers to be implanted in the human body, which could help the blind to see and the deaf to hear.

Isn’t it great if we can talk to our computer? Technological advancements are taking place in this field as well; companies are working to produce more effective voice recognition software for computers.
There are many people in this world who cannot use a keyboard or mouse because they have lost the full use of their hands; the reason can be anything, such as a spinal cord injury, orthopedic trauma, carpal tunnel, or muscle weakness. In that case they can simply talk to the computer and get its response. To open a file they simply say “Open File”, or to close it, “Close File”; whatever they want to type, they simply speak and the computer types it for them. New research is going on in this field, where a small chip implanted in the human brain will enable a paralyzed person to operate a computer or even play games just by thinking about it. Thoughts can now control a computer, a television, a fan, a light… or anything you can think of.

Many people in this world benefit greatly from digital technologies. Hearing-impaired people are no longer at a disadvantage compared with those who can hear: they can hear by wearing strong digital hearing aids, and they can also use a telephone with closed captioning, reading a phone conversation as they listen to it.
Sometimes hearing aids do not provide effective communication; in that case an ABI (Auditory Brainstem Implant) can be very effective for the patient. An ABI sends audio signals directly to the brain: part of the device is hardwired into the brain stem, and a small unit, about the size of a small radio, is carried in the patient’s pocket.

Violence and conflict have been part of human life throughout history, and the use of technology to make weapons, fighter jets and other devices has changed conventional war. The military uses smart bombs, which seek out a target and destroy it; these bombs can be operated by computers in the aircraft and guided toward the target, enabling the military to strike only the enemy. The use of unmanned systems, including robot aircraft, to attack targets significantly reduces the risk to human life. Scientists are also working on schemes to build a shield to defend countries from enemy ballistic missiles.

Science and technology have changed the way humans lived in the past, and they will change the future in ways that today can only be imagined or seen in the movies.

Thursday, July 06, 2006

Software Testing

Software testing is the most important phase in any software development project; it lets us know whether our project or product is going to succeed or fail before it goes live.
We can define software testing as the process used to assess the quality of developed computer software, so that the developer can compare the results produced by the finished software with the expected ones.
Testing cannot give us totally bug-free software; we cannot achieve complete correctness through testing. But at least we can find some of the defects that are in the software.

There are many different ways and levels to test software; here I am mentioning the very basic and common levels of testing, which are usually performed for all software. If the software is very complex or real-time oriented, more levels can be added to achieve a reliable software product.

Broadly, software testing is categorized into three levels:
Ø Unit Testing
Ø Integration Testing
Ø System Testing

Unit Testing.
In this phase of testing, the separate units of the software system are tested. It is also known as component testing: each module is tested alone to find any errors in its source code. The programmer performs this testing as part of the development life cycle. The prerequisite for this phase is a Unit Test Plan, which should be written from the low-level design document.
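As a minimal sketch of unit testing, using Python’s standard unittest module (the add function is a hypothetical unit standing in for a real module):

```python
import unittest

def add(a, b):
    """The unit under test -- a hypothetical module function."""
    return a + b

class TestAdd(unittest.TestCase):
    # Each test exercises the unit alone, with no other modules involved.
    def test_positive_numbers(self):
        self.assertEqual(add(2, 3), 5)

    def test_negative_and_zero(self):
        self.assertEqual(add(-1, 1), 0)

if __name__ == "__main__":
    unittest.main(exit=False)  # run this module's unit tests
```

Each test case checks one behavior of the unit in isolation, which is exactly what this level of testing is about.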

Integration Testing.
This testing is done to test the communication between the different components of the software; all the modules that the program comprises are brought together to test whether they function correctly with their counterparts. Modules are typically integrated in a top-down, incremental fashion.
For some projects integration testing can further be divided into two levels: -
Ø Assembly Integration Testing.
Ø System Integration Testing.
During assembly integration testing, the integration of the software components is tested. During system integration testing, the communication with external systems is tested. The Integration Test Plan should be written from the high-level design document.
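Top-down integration can be sketched as follows: an upper-level module is exercised against a stub that stands in for a lower-level component until the real one is ready (all class and method names here are hypothetical):

```python
class DatabaseStub:
    """Stands in for the real data layer during top-down integration."""
    def get_price(self, item):
        return {"pen": 10, "book": 50}.get(item, 0)

class Billing:
    """Upper-level module under test; depends on a data layer."""
    def __init__(self, db):
        self.db = db

    def total(self, items):
        return sum(self.db.get_price(i) for i in items)

# Integration test: Billing works correctly against its counterpart's
# interface even before the real database module exists.
billing = Billing(DatabaseStub())
print(billing.total(["pen", "book"]))  # 60
```

When the real data layer is finished, it replaces the stub, and the same test verifies that the two modules communicate correctly.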

System Testing
This level of testing is done to test the functionality of the software as a complete system. System testing is entirely different from integration testing: in integration testing the interaction of components is tested, while in system testing the interaction of the complete software with the system is tested.
Besides system functionality and behavior, system testing may include testing configuration, throughput, security, resource utilization, and performance.
After the testing levels we come to test design. As we all know, before developing any component of the software we first have to design it; the same is true of testing. Before testing any software we have to design tests for the system.
The two most common methods of test design are black-box testing and white-box testing.
Black-box test design treats the system as a "black box", so it does not explicitly use knowledge of the internal structure; it is usually described as focusing on testing functional requirements. White-box test design allows one to peek inside the "box" and focuses specifically on using internal knowledge of the software to guide the selection of test data. Synonyms for white-box include structural, glass-box and clear-box testing. Now let us discuss some testing techniques. There are many techniques for testing software; here I mention a few:
Ø Regression Testing: regression tests are tests that are run more than once. If the product or its environment has changed since the last time a test was run, it may find a new bug. Regression tests may be automated or manual.
Ø Stress Testing: tests are run to determine whether the software can withstand an unreasonable load, insufficient resources or extreme usage.
Ø Performance Testing: tests are run to determine actual performance as compared to predicted performance.
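As a tiny sketch of a performance test, we can compare actual running time against a predicted budget (the workload and the 0.5-second budget are hypothetical; real performance testing is done with dedicated tools under realistic load):

```python
import time

def work():
    # Hypothetical operation whose performance we want to check.
    return sum(i * i for i in range(100_000))

start = time.perf_counter()
result = work()
elapsed = time.perf_counter() - start

BUDGET_SECONDS = 0.5  # the predicted performance target
print("within budget" if elapsed < BUDGET_SECONDS else "too slow")
```

The same measurement, rerun after every change to the software, also doubles as a simple automated regression check on performance.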
In the software market you can find many tools for software testing; depending on your system and budget, you have to choose among them. Some of the tools are:
Ø WinRunner
Ø TeamTest
Ø Silk Test
Ø JTest, C++ Test
Ø JUnit
The testing phase, the most important phase in the software cycle, often gets secondary treatment from developers and IT managers; nobody wants to join the testing team, and in many software development teams people have to be pushed into testing.
Still, testing is the only way to determine whether an application will function properly before it is deployed.

Sunday, July 02, 2006

Software Management: Version Control

Version Control System

When I started my job with a startup company I was its 3rd employee. We were working on an idea to make the Internet faster and more reliable.
As my company had not yet received millions of dollars in venture capitalist (VC) funding, we had limited resources for development. Life was easy and enjoyable among the three of us, and we worked hard to build the first demo of the concept. With the help of this demo my company managed to get good funding from VCs, along with a timeline for the first beta release of the product.

We started hiring, and in a month’s time we were a strong team of 10 software developers. I distributed all the code and documents I had to the team members, who had to build the software using those files as the base.
Slowly the complexity of the software increased, and team members had to share files to implement interfaces. Everything was very smooth, because I was handling all the files and had full control over the code. Then I went on vacation for about 3 weeks and came back to the office only to learn about the disaster that had happened while I was away.
Analysis: what exactly happened while I was not in the office?
Everybody worked hard and spent extra hours in the office to make sure things got finished. They finished the coding on their local machines and moved all their files to my machine, which was acting as a server.
When I checked the files on the server, they contained only the last person’s changes; the work of all the other engineers had been overwritten.
This was a loss of 3 weeks of the team’s effort, and there was no way to recover the old files or the other members’ changes, because there were no backups and no version control mechanism.
Having learned the lesson, I pushed my company to buy a version control system.
What is Version Control System?
Version control is the task of keeping a software system, which consists of many versions and configurations, well organized. In other words, a version control system enables developers to keep historical versions of the files under development and to retrieve past versions. It stores version information for every file (and for the entire project structure) in a collection normally called a repository, or VOBs, depending on the version control tool.
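The core idea can be sketched in a few lines: every commit appends a new version instead of overwriting the old one, so any past state can be retrieved (a toy in-memory model, not how real tools store data):

```python
class TinyRepo:
    """Toy repository: keeps every version of every file ever committed."""
    def __init__(self):
        self.history = {}  # filename -> list of contents, oldest first

    def commit(self, filename, contents):
        self.history.setdefault(filename, []).append(contents)
        return len(self.history[filename])  # new version number, 1-based

    def checkout(self, filename, version=None):
        # Latest version by default; any past version on request.
        versions = self.history[filename]
        return versions[-1] if version is None else versions[version - 1]

repo = TinyRepo()
repo.commit("main.c", "int main() { return 0; }")
repo.commit("main.c", "int main() { return 1; }")
print(repo.checkout("main.c", 1))  # the old version is still there
```

Had our server worked like this, the last person’s copy would have become version 2 instead of destroying version 1, and nothing would have been lost.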

How did I choose Version Control System suitable for my needs?
My aim was to find good source control software: secure, easy to manage, scalable, with a small learning curve, and easily expandable as needs grow.
I did some research in the market and drew up a list of the latest version control systems.
Some of the best VCS packages are:
· Rational ClearCase
· Microsoft’s Visual SourceSafe
· PVCS
· CVS etc.
I found that all these version control systems have good features and are backed by some of the big companies in the software industry. I chose Rational ClearCase mainly because it is available on Linux.
Some of the features of ClearCase are:
Ø ClearCase manages files, directories and other development assets for the entire development cycle.
Ø Parallel development support: automatic branching, merging and differencing technology that resolves conflicts more easily.
Ø Transparent, real-time access to files and directories, with fast access to any version of any element.
Ø Disconnected usage: development can continue while disconnected from the network, and changes are easily synchronized on reconnection.
Ø Build and release management: optimized build times and improved reproducibility of builds.
Ø User authentication to control access to software files.
Ø Scalability from small teams to large, geographically distributed enterprises.

What I achieved after deploying the version control system was peace of mind: I no longer had to worry that a developer might merge a wrong version. I could always get an older version back and avoid any last-minute surprises.
