Assignment 6 (Due: before August 19, 2009, 13:00hrs)
sharlyn joy pines
Posts : 53 Points : 59 Join date : 2009-06-19 Location : Philippines
Subject: Re: Assignment 6 (Due: before August 19, 2009, 13:00hrs) Mon Aug 24, 2009 5:40 pm
Introduction
It is a privilege to work with the university as an IT consultant. That is why I would grab the opportunity to serve the university with my knowledge, enhancing its technology, infrastructure, innovations, steps and processes so that internet connectivity can be improved.
Customers are demanding fast, reliable and secure Internet connectivity. As a service provider, you need to ensure your connectivity provides the best possible performance.
As a student, it is definitely a frustrating experience to be stuck with slow internet speed. It can really get to you, especially when you are downloading, uploading or saving something important and rushing to get it done.
Definition of Important Terms:
Technology
Technology is a broad concept that deals with humans' as well as other animal species' usage and knowledge of tools and crafts, and how it affects a species' ability to control and adapt to its environment. It is systematic knowledge and action, usually of industrial processes but applicable to any recurrent activity. Technology deals with the tools and techniques for carrying out plans.
Infrastructure
Infrastructure can be defined as the basic physical and organizational structures needed for the operation of a society or enterprise, or the services and facilities necessary for an economy to function.
1. An underlying base or foundation especially for an organization or system. 2. The basic facilities, services, and installations needed for the functioning of a community or society, such as transportation and communications systems, water and power lines, and public institutions including schools, post offices, and prisons.
Innovations
The term innovation refers to a new way of doing something. It may refer to incremental and emergent or radical and revolutionary changes in thinking, products, processes, or organizations. Innovation typically involves creativity, but is not identical to it: innovation involves acting on the creative ideas to make some specific and tangible difference in the domain in which the innovation occurs.
IT consultant (Job Description)
An IT consultant works in partnership with clients, advising them how to use information technology in order to meet their business objectives or overcome problems. Consultants work to improve the structure and efficiency of an organization's IT systems.
IT consultants may be involved in a variety of activities, including marketing, project management, client relationship management and systems development.
They may also be responsible for user training and feedback. In many companies, these tasks will be carried out by an IT project team. IT consultants are increasingly involved in sales and business development, as well as technical duties.
Typical Work Activities
Tasks typically involve:
• meeting with clients to determine requirements;
• working with clients to define the scope of a project;
• planning timescales and the resources needed;
• clarifying a client's system specifications, understanding their work practices and the nature of their business;
• traveling to customer sites;
• liaising with staff at all levels of a client organization;
• defining software, hardware and network requirements;
• analyzing IT requirements within companies and giving independent and objective advice on the use of IT;
• developing agreed solutions and implementing new systems;
• presenting solutions in written or oral reports;
• helping clients with change-management activities;
• project managing the design and implementation of preferred solutions;
• purchasing systems where appropriate;
• designing, testing, installing and monitoring new systems;
• preparing documentation and presenting progress reports to customers;
• organizing training for users and other consultants;
• being involved in sales and support and, where appropriate, maintaining contact with client organizations;
• identifying potential clients and building and maintaining contacts.
Taking the Case of the USEP, what would be the best technology, infrastructure, innovations, steps and processes that I could suggest to have an improvement of the university’s internet connectivity?
As a recipient of the current internet connection of this university, I could say that it is not satisfying because sometimes, or rather most of the time, students experience interruptions like low speed, disconnections, lagging, etc. Thus, as a newly hired IT consultant, the following are the possible actions that I believe could help improve the internet connection.
As technology grows, so does our need for bigger, better and faster. Over the years, the way information is presented via the Web has changed drastically. Ten years ago, being able to center, bold or color text was something to admire, while today Flash, animations, online gaming, database-driven Web sites, e-commerce and virtual offices are becoming standards. The need for speed has changed the options available to consumers and businesses alike in terms of how, and how fast, we can connect to the Internet.
Sources:
http://www.mydigitallife.info/2009/02/12/improve-internet-connection-speed-with-cfostspeed/
this is still incomplete...
Last edited by sharlyn joy pines on Tue Oct 06, 2009 7:23 am; edited 1 time in total

shane sacramento
Posts : 58 Points : 60 Join date : 2009-06-19 Age : 32 Location : davao city
Subject: Re: Assignment 6 (Due: before August 19, 2009, 13:00hrs) Tue Aug 25, 2009 9:13 pm
Technology is a broad concept that deals with human as well as other animal species' usage and knowledge of tools and crafts, and how it affects a species' ability to control and adapt to its environment. However, a strict definition is elusive; "technology" can refer to material objects of use to humanity, such as machines, hardware or utensils, but can also encompass broader themes, including systems, methods of organization, and techniques. The term can either be applied generally or to specific areas: examples include "construction technology", "medical technology", or "state-of-the-art technology".
http://en.wikipedia.org/wiki/Technology
From what I have seen and observed at the university, the hardware is fine, except that some of the computer units are only capable of receiving and sending information at a limit of 10 Mbps, which makes the internet connection slower and at times inaccessible.
The term innovation refers to a new way of doing something. It may refer to incremental and emergent or radical and revolutionary changes in thinking, products, processes, or organizations. A distinction is typically made between invention, an idea made manifest, and innovation, ideas applied successfully. (Mckeown 2008) In many fields, something new must be substantially different to be innovative, not an insignificant change, e.g., in the arts, economics, business and government policy. In economics the change must increase value, customer value, or producer value. The goal of innovation is positive change, to make someone or something better. Innovation leading to increased productivity is the fundamental source of increasing wealth in an economy.
http://en.wikipedia.org/wiki/Innovation
So, from what I have understood, innovation is broad, meaning it covers a very large area, which may include one or more of the following fields: the arts, economics, business, design, technology, sociology and engineering. Innovation means creating and designing new and better ways to improve something; in the university's case, its internet connectivity. (run out of words)
IK
Posts : 46 Points : 47 Join date : 2009-06-19
Subject: Re: Assignment 6 (Due: before August 19, 2009, 13:00hrs) Wed Aug 26, 2009 12:39 am
[...continuation] Improving your Internet connection can be simple if you keep some basic ideas in mind.
For example, remove viruses and spyware from your system, and use an unprivileged user account for Internet access to slow down any future infections from the wild Internet. Viruses, stealthkits (also known as rootkits), and spyware will compete with or take over your Internet connection and prevent you from enjoying it yourself.
So it is important to scan your system routinely for such problems.
Do not compete with other computers for Internet access via a common (shared) connection such as wireless, broadband cable, or telephone service. Do not operate your main computer as a gateway for other household computers to access the Internet. Such shared connections cause the computers to compete with each other and additional delays are inserted for the computers to communicate with each other about sharing the Internet connection.
Telephone modem connections to bulletin boards and Internet access are limited by the Federal Communications Commission to slow speeds useful over voice-grade telephone wiring. So, if you upgrade from telephone modem access to another service such as wireless, broadband, or optical fiber, then such a connection will speed up in a noticeable way.
Where two local wiring options to the Internet are available from your computer, opt for the faster one. For example, a broadband cable modem can offer either an Ethernet (Category 5) or a USB connection; Ethernet is generally the faster and more dependable of the two. Also, a broadband cable modem will be faster than a telephone modem, but the telephone modem can serve as a good backup capability, if desired.
If you can afford it, get an optical fiber connection to the Internet or possibly two-way satellite Internet access for highest connection speeds. Optical fiber run directly to the computer (with a small connecting device) has superior connection speeds for two-way networking that no other fielded technology can surpass.
Finally, learn to check your Internet connection speed online, and routinely check it to recognize when Internet congestion or other slowdowns are occurring.
Improve your Broadband Connection
Broadband Connection Speed
The only thing better than a fast broadband Internet connection is a faster broadband Internet connection. Broadband Internet speed tests allow you to measure your current broadband speed against that of faster broadband Internet connections. There are various programs and software packages that you can purchase to increase the speed of your Internet connection. You can also make adjustments to the hardware components of your system (upgrade processor speed and memory levels) to maximize your computer's broadband connection potential. If you're not looking to purchase additional software or hardware add-ons, there are manual "tweaks" that you can make to your system to boost your broadband speed.
Increasing Your Broadband Internet Speed
Let's assume that you access the Internet via a broadband LAN connection. The following are three examples of ways through which you can manipulate your network settings and increase your broadband speed:
Reduce your network latency by increasing the request buffer size
Tests of LAN broadband connections have shown that delays can be caused by the default request buffer size setting of 4356 decimal. It has been shown that an increase to a 16384 decimal setting can allow for better performance (such an increase is only possible if you have the necessary memory). By applying this slight "tweak," you can noticeably increase your Internet speed and broadband networking capabilities.
Altering your network task scheduler
If you've encountered long waits when trying to open network folders, then this "tweak" is for you. One of the default settings with broadband networking is that when you open a network folder, your system probes the networked computer in order to search for scheduled tasks. By disabling this option for a LAN connection, you can increase your broadband networking speed.
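For what it's worth, on Windows XP-era systems the two tweaks above are usually applied through the registry. The keys below are my assumptions drawn from common XP tuning guides (`SizReqBuf` for the request buffer, and the remote Scheduled Tasks namespace extension for the folder-open delay); this is a sketch only, so back up the registry and verify the keys for your Windows version before running anything:

```shell
:: Sketch only -- assumed registry keys from common XP tuning guides; back up first.

:: 1. Raise the SMB request buffer from the 4356 default to 16384 (decimal).
reg add "HKLM\SYSTEM\CurrentControlSet\Services\lanmanserver\parameters" /v SizReqBuf /t REG_DWORD /d 16384 /f

:: 2. Stop Explorer probing remote machines for Scheduled Tasks when a
::    network folder is opened (delete the namespace extension entry).
reg delete "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Explorer\RemoteComputer\NameSpace\{D6277990-4C6A-11CF-8D87-00AA0060F5BF}" /f
```

A reboot (or at least restarting the Server service and Explorer) is typically needed before either change takes effect.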
Increasing your network transfer rate
Transfer rate, also referred to as throughput, refers to the speed at which data can be moved from one location to another. Network redirector buffers serve the purpose of optimizing your disk performance, thereby allowing for the fastest possible broadband networking speed. If you increase the number of network redirector buffers functioning on your system, you could greatly increase your throughput. An Internet speed test following this change will yield noticeable results.
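Throughput is simply data moved per unit of time, so a speed test reduces to timing a transfer of known size. A minimal Python sketch (the function name is my own, not from any particular tool):

```python
def throughput_mbps(bytes_transferred: int, seconds: float) -> float:
    """Convert a measured transfer into megabits per second."""
    bits = bytes_transferred * 8          # 8 bits per byte
    return bits / seconds / 1_000_000     # 1 Mbps = 1,000,000 bits/s

# Example: a 25 MB download that took 20 seconds
print(round(throughput_mbps(25_000_000, 20.0), 1))   # -> 10.0 Mbps
```

In a real speed test you would time an actual download (e.g. with `time.monotonic()` around a file fetch) and feed the measured size and elapsed time into a function like this.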
Internet Speed Tests
If you are looking to perform an Internet speed test for your system, there are various tools online through which you can provide your ISP, your area code, your connection type, etc., and receive a reading of your broadband speed compared to the top providers in your area. This allows for you to realize if your broadband speed is lacking in comparison to others, and work to maximize your broadband networking potential. This can be achieved through implementation of the above tweaks or through hardware upgrades and software purchases.
Checking Your Internet Connection Speed with Your Service Provider: -Go to your Internet service provider's website and log into your account. Your Internet connection speed should be listed in the details of your account.
Checking Your Internet Connection Speed through a Speed Test Website: -There are several websites on the Internet which will perform an Internet connection speed test check for you. Some links are: www.speakeasy.net/speedtest http://www.speedtest.net/ http://us.mcafee.com/root/speedometer/default.asp
Checking Your Internet Connection Speed in Windows XP: -Right-click or hover over your Internet connection icon, which usually appears in the bottom right corner of your screen (it typically looks like two computer monitors). The Internet connection speed will be displayed; for example, it will say something like 100 Mbps. This can also be accessed from the Control Panel: select the Network Connections icon, double-click the network connection that you wish to check, and a dialog box will display your speed among other details.
Checking Your Internet Connection Speed in Windows Vista: -Click on the Internet connection icon that usually appears in the bottom right corner of your screen (it typically looks like two computer monitors), then click on the connection that pops up. The Network and Sharing Center will open. For the connection whose speed you want to view, click the blue "View Status" link, and a box will open detailing your speed among other connection details. This can also be accessed via the Control Panel: click the Network and Sharing Center icon and proceed as above.
References: http://www.umich.edu/pres/inforev2/infrastructure/ http://www.helium.com/
Jovylin O. Sandoval
Posts : 40 Points : 42 Join date : 2009-06-21 Age : 34 Location : Purok Anahaw Buhangin, Davao City
Subject: Infrastructure Wed Aug 26, 2009 7:00 am
If I were hired by the University President as an IT consultant, I would suggest improving the infrastructure in order for internet connectivity to be improved. First and foremost, I will define and discuss the Internet and the OSI model.
The Internet
The Internet is a worldwide network of computers and computer networks that can communicate with each other using the Internet Protocol. Any computer on the Internet has a unique IP address that can be used by other computers to route information to it. Hence, any computer on the Internet can send a message to any other computer using its IP address. These messages carry with them the originating computer's IP address allowing for two-way communication. In this way, the Internet can be seen as an exchange of messages between computers. The Internet works in part because of protocols that govern how the computers and routers communicate with each other. The nature of computer network communication lends itself to a layered approach where individual protocols in the protocol stack run more-or-less independently of other protocols. This allows lower-level protocols to be customized for the network situation while not changing the way higher-level protocols operate. A practical example of why this is important is that it allows an Internet browser to run the same code regardless of whether the computer it is running on is connected to the Internet through an Ethernet or Wi-Fi connection. Protocols are often talked about in terms of their place in the OSI reference model, which emerged in 1983 as the first step in an unsuccessful attempt to build a universally adopted networking protocol suite.
For the Internet, the physical medium and data link protocol can vary several times as packets traverse the globe. This is because the Internet places no constraints on what physical medium or data link protocol is used. This leads to the adoption of media and protocols that best suit the local network situation. In practice, most intercontinental communication will use the Asynchronous Transfer Mode (ATM) protocol (or a modern equivalent) on top of optic fibre. This is because for most intercontinental communication the Internet shares the same infrastructure as the public switched telephone network.
At the network layer, things become standardized with the Internet Protocol (IP) being adopted for logical addressing. For the World Wide Web, these “IP addresses” are derived from the human readable form using the Domain Name System (e.g. 72.14.207.99 is derived from www.google.com). At the moment, the most widely used version of the Internet Protocol is version four but a move to version six is imminent.
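The derivation of numeric addresses from names, and the version-four/version-six distinction, can be poked at with Python's standard `ipaddress` module. A small sketch (the IPv6 address is an arbitrary documentation-range example, not one from the text):

```python
import ipaddress

# The IPv4 address quoted in the text, plus an IPv6 address for comparison.
v4 = ipaddress.ip_address("72.14.207.99")
v6 = ipaddress.ip_address("2001:db8::1")   # documentation-range IPv6 address

print(v4.version)      # 4
print(v6.version)      # 6
print(v4.is_private)   # False -- 72.14.207.99 is a public address
print(int(v4))         # the 32-bit integer the dotted-decimal form encodes
```

Resolving a human-readable name to such an address is the job of DNS; in Python that step would be `socket.gethostbyname("www.google.com")`, which needs a live network connection.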
At the transport layer, most communication adopts either the Transmission Control Protocol (TCP) or the User Datagram Protocol (UDP). TCP is used when it is essential that every message sent is received by the other computer, whereas UDP is used when it is merely desirable. With TCP, packets are retransmitted if they are lost and placed in order before they are presented to higher layers. With UDP, packets are not ordered or retransmitted if lost. Both TCP and UDP packets carry port numbers with them to specify what application or process the packet should be handled by. Because certain application-level protocols use certain ports, network administrators can manipulate traffic to suit particular requirements. Examples are to restrict Internet access by blocking the traffic destined for a particular port or to affect the performance of certain applications by assigning priority.
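The TCP versus UDP contrast above can be sketched with Python's standard socket module. The snippet below sends a single UDP datagram over the loopback interface: note there is no connection setup and no acknowledgement, though on loopback the datagram does arrive in practice. The port is picked by the OS and the payload is an arbitrary choice for illustration:

```python
import socket

# A UDP "receiver": bind to loopback and let the OS pick a free port.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))
receiver.settimeout(2.0)               # avoid hanging if the datagram is lost
port = receiver.getsockname()[1]

# A UDP "sender": no handshake, no acknowledgement -- fire and forget.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"hello", ("127.0.0.1", port))

data, addr = receiver.recvfrom(1024)
print(data)   # b'hello' on loopback; over a real network UDP may drop it
sender.close()
receiver.close()
```

A TCP version of the same exchange would add `connect()`/`accept()` calls (the handshake) and would retransmit the payload automatically if it were lost.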
Above the transport layer, there are certain protocols that are sometimes used and loosely fit in the session and presentation layers, most notably the Secure Sockets Layer (SSL) and Transport Layer Security (TLS) protocols. These protocols ensure that the data transferred between two parties remains completely confidential and one or the other is in use when a padlock appears in the address bar of your web browser. Finally, at the application layer, are many of the protocols Internet users would be familiar with such as HTTP (web browsing), POP3 (e-mail), FTP (file transfer), IRC (Internet chat), BitTorrent (file sharing) and OSCAR (instant messaging).
Local Area Networks
Despite the growth of the Internet, the characteristics of local area networks (computer networks that run at most a few kilometres) remain distinct. This is because networks on this scale do not require all the features associated with larger networks and are often more cost-effective and efficient without them.
In the mid-1980s, several protocol suites emerged to fill the gap between the data link and applications layer of the OSI reference model. These were Appletalk, IPX and NetBIOS with the dominant protocol suite during the early 1990s being IPX due to its popularity with MS-DOS users. TCP/IP existed at this point but was typically only used by large government and research facilities. As the Internet grew in popularity and a larger percentage of traffic became Internet-related, local area networks gradually moved towards TCP/IP and today networks mostly dedicated to TCP/IP traffic are common. The move to TCP/IP was helped by technologies such as DHCP that allowed TCP/IP clients to discover their own network address — a functionality that came standard with the AppleTalk/IPX/NetBIOS protocol suites.
It is at the data link layer though that most modern local area networks diverge from the Internet. Whereas Asynchronous Transfer Mode (ATM) or Multiprotocol Label Switching (MPLS) are typical data link protocols for larger networks, Ethernet and Token Ring are typical data link protocols for local area networks. These protocols differ from the former protocols in that they are simpler (e.g. they omit features such as Quality of Service guarantees) and offer collision prevention. Both of these differences allow for more economic set-ups.
Despite the modest popularity of Token Ring in the 80's and 90's, virtually all local area networks now use wired or wireless Ethernet. At the physical layer, most wired Ethernet implementations use copper twisted-pair cables (including the common 10BASE-T networks). However, some early implementations used coaxial cables and some recent implementations (especially high-speed ones) use optic fibres. Where optic fibre is used, the distinction must be made between multi-mode fibre and single-mode fibre. Multi-mode fibre can be thought of as thicker optical fibre that is cheaper to manufacture but that suffers from less usable bandwidth and greater attenuation (i.e. poor long-distance performance).
Wide Area Network
Network infrastructure is the underlying system of cabling, phone lines, hubs, switches, routers and other devices that connect various parts of an organization through a Wide Area Network (WAN). If a sound network infrastructure is in place, most users can connect people and information throughout their organization and beyond to accomplish assigned responsibilities. Without a network infrastructure, such capabilities are available piecemeal, usually to individuals who may have the vision, initiative and resources to create this capability for themselves.
A WAN allows users to communicate with other personnel within the organization through tools such as e-mail systems. The WAN also provides a bridge to the Internet and World Wide Web that allows anyone connected to the WAN to access information and people outside the organization. WANs are usually "closed" through security measures that prevent external third parties from accessing information within the WAN without a password and/or personal identification number.
A key function of a WAN is to connect Local Area Networks (LANs) throughout the colleges. The LAN is housed within a building and serves to connect all users within that building to one local network. By connecting the LAN to a WAN, all LAN users gain access to others in the enterprise and to the electronic world beyond the network. A community college that has every user connected through a LAN to a WAN has established the infrastructure necessary to take full advantage of the telecommunications capabilities that exist today and those that will be available in the future. ACCD's network infrastructure consists of a WAN that connects the college's four campuses to the district IS data operations center and to the satellite campuses. Each ACCD campus connects to the Internet through gigabit Ethernet lines using an Alcatel OS9 router and an Alcatel 7800 Gigabit Ethernet switch located at the central district data operations center.
Gigabit Ethernet is a LAN architecture that supports data transfer rates of 1 gigabit (1,000 megabits) per second. The Cisco 7513 Internet router is used to provide video links to Regions 13 and 20 Education Service Centers (ESC). The networking infrastructure also includes e-mail servers and a Cisco PIX firewall to prevent intruders using an Internet connection from accessing ACCD's internal network.
Some Technical Terms related on Internet Connection
Bandwidth describes the data throughput capacity of a particular communications technology or link. It is closely analogous to the carrying capacity of a water pipe. It is usually measured as the number of bits of information per second that can be transferred, a bit being a single binary digit (either '0' or '1'). A single alphanumeric character is usually represented by a string of eight bits (a byte). Allowing for overheads, the rate at which characters can be transferred over a particular link is roughly one tenth of the specified bit transfer rate. So, for example, at a relatively low transfer rate of 14,400 bits per second (14.4 Kbps) a page of text of say 2,500 characters (approximately 20,000 bits) would take nearly 2 seconds.
A full colour picture (image) could require 100 Kbits to represent a 25x25 mm2 area. Usually images are compressed using a system such as JPEG (for Joint Photographic Experts Group) which can reduce this by a factor of 10 to 100, depending on the richness of the visual information. A video clip, with sound and pictures, is similar to a series of pictures and a 60 second segment using a small frame (75x50 mm2) and a low quality compression system can take up 4 Mbits (500 Kbytes), which would take almost five minutes to download using a 14.4 Kbps line running at full capacity. On the other hand, a full screen broadcast quality image with 720x480 resolution using MPEG-2 (for Moving Pictures Experts Group), such as is used for DVD movies, requires up to 15 Mbps of bandwidth. The various rates are compared in Table 1.
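The arithmetic in the two paragraphs above can be checked directly. A small Python sketch reproducing the text's own figures (a 2,500-character page at 14.4 Kbps, the rough one-tenth overhead rule, and a 4 Mbit video clip):

```python
# Text page: 2,500 characters, 8 bits each, over a 14.4 Kbps line.
chars = 2_500
line_bps = 14_400
raw_seconds = chars * 8 / line_bps            # ignoring overheads
# With overheads, the effective character rate is roughly line rate / 10:
overhead_seconds = chars / (line_bps / 10)

print(round(raw_seconds, 2))       # 1.39 s
print(round(overhead_seconds, 2))  # 1.74 s -- the text's "nearly 2 seconds"

# Video clip: 4 Mbits (500 KB) over the same 14.4 Kbps line.
clip_bits = 4_000_000
print(round(clip_bits / line_bps / 60, 1))    # 4.6 min -- "almost five minutes"
```

The same three lines of arithmetic apply to any of the link speeds discussed later: divide the payload in bits by the line rate in bits per second.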
Low bandwidth (or low speed) links are anything below 100,000 bits per second (represented as 100 Kbps). High bandwidth or high speed links are in the range 100 Kbps to 2,000 Kbps which is usually presented as 2 Mbps.
Broadband commonly refers to a data throughput capacity of more than 2 Mbps. The term reflects the ability of such links to handle many different types of information up to and including full motion video and other services requiring very large throughput capability.
Analogue vs Digital. Analogue information is based on signals where some feature of the signal, usually amplitude or phase, varies continuously with time. Digital information (a 1 or 0), on the other hand, is represented by just one of two possible states: high/low (voltage), on/off (signal) etc. Computers deal with digital information. It is often necessary to convert the digital information used in the computers into analogue signals in order for it to be transmitted over a communications link and then convert it back to digital form when received. A modem (modulator/demodulator) is used to carry out the analogue to digital conversion, and its reverse.
The upper limit of data carrying capacity over a normal telephone line for analogue communications is 56 Kbps (and this only under favourable conditions) whereas digital communications can range up to several megabits per second (1 Mbps plus). In general, analogue signals are better over long distances and noisy lines.
Cable modems, which connect to co-axial cable networks, can carry data at speeds up to 2 Mbps. Digital signals can be used for higher data speeds but require high line quality and can usually be sent only over relatively short distances.
Fibre optic cables always transmit digital signals and under appropriate conditions can reach into the Gigabit range (1,000 Mbps plus). Fibre is widely used for the high volume inter city telecommunications, backbone and international links but is only slowly being deployed for business and domestic use.
Jovylin O. Sandoval
Posts : 40 Points : 42 Join date : 2009-06-21 Age : 34 Location : Purok Anahaw Buhangin, Davao City
| Subject: Continuation of Infrastructure Wed Aug 26, 2009 7:06 am | |
| The Internet Connection Infrastructure In information technology and on the Internet, Infrastructure is the physical hardware used to interconnect computers and users. Infrastructure includes the transmission media, including telephone lines, cable television lines, and satellites and antennas, and also the routers, aggregators, repeaters, and other devices that control transmission paths. Infrastructure also includes the software used to send, receive, and manage the signals that are transmitted. In some usages, infrastructure refers to interconnecting hardware and software and not to computers and other devices that are interconnected. However, to some information technology users, infrastructure is viewed as everything that supports the flow and processing of information. Infrastructure companies play a significant part in evolving the Internet, both in terms of where the interconnections are placed and made accessible and in terms of how much information can be carried how quickly. The figure below shows the relationships between some of the key entities which make up the internet connection infrastructure. Some elements of the telecommunications/Internet infrastructure Local loop: This includes the copper wire pairs that link terminals (commonly telephones) to their nearest exchange but may also in some rural areas include multi-access radio technology. Internet Service Provider (ISP): The ISP is integral to connection to the Internet. It is the ISP who provides the Internet Protocol (IP) linking services which allow messages to be routed throughout the Internet 'cloud'. Most ISPs will have one or more broadband or high bandwidth connections to the telecommunications backbone which both allows users to connect to the ISP and links the ISP to other IP service providers on the Internet. 
Telephone Exchange: Internet connection capacity is normally dependent on the capacity of the link between a user and their local telephone exchange, which will in turn depend on the location and the age and quality of the exchange's equipment. Many Telecom NZ exchanges in rural and congested urban areas were upgraded in the early 1980s and continue to provide good standard telephone services but are not able to provide services and access speeds which are available through a modern exchange. Backbone: The telecommunications backbone is the network which links exchanges to each other and includes both transmission and circuit switching elements. The transmission elements may include copper and optical fibre cabling, and microwave links. The international circuits also include satellite links. Parts of the existing domestic backbone between some provincial centres may require upgrading to support greater digital data flows. Technologies Available and in Use Aside from dedicated fibre-optic and coaxial cable networks and wireless connections, which are available at present, access to the Internet is generally only available over the telephone network. Even for those with satellite connections, the return path from the user to the ISP relies on the telephone network. Thus, in practice for the overwhelming majority, technologies available for Internet connection are limited to those capable of using the copper wire local loop. Technologies available over the local loop V90 Modem: This is presently the 'domestic standard' for achieving a data rate of up to 56 Kbps over a standard telephone line. It requires only an inexpensive modem connecting a personal computer to a telephone line, and will support slow-motion video. There are limiting factors, however. 
A connection speed of 56 Kbps cannot be realised if there is more than one analogue-to-digital conversion in the connection to the ISP, and it is usually limited to a maximum line length of 3 to 5 kilometres between the subscriber and the nearest exchange. In practice this means that for many users access speeds are less than the theoretical maximum: typically 33 Kbps is available in urban areas, but in many rural areas line quality is such that speeds fall well below this. For example, many rural areas are served with multi-access radio technology, introduced over ten years ago to eliminate party lines, which can only handle data transfer rates of 9.6 Kbps.

ADSL (Asymmetric Digital Subscriber Line): This is one of a family of technologies referred to collectively as xDSL, a term covering the different types of Digital Subscriber Line. xDSL technologies use sophisticated modulation schemes to pack digital data onto copper wires. xDSL is similar to ISDN (see below) inasmuch as both operate over existing copper telephone lines, but it requires short runs to a central telephone exchange (about 2 kilometres). Potentially, xDSL offers broadband-level speeds: up to 32 Mbps for downstream traffic, and from 32 Kbps to over 1 Mbps for upstream traffic. ADSL is offered by Telecom NZ to subscribers as JetStream. It is capable of speeds of up to (but generally much less than) 6 Mbps in one direction and a much lesser speed in the other. It is currently available only in the main centres but is slated to be rolled out progressively throughout the country and is displacing ISDN (see below). However, this may be slow (after almost 20 years, ISDN still does not reach into most of rural New Zealand). Given that even inner-city suburbs in areas such as Wellington cannot presently be serviced with ADSL, some further technology development will be required before ADSL can provide a widespread solution to bandwidth limitations, especially in rural areas.
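To put these access speeds in perspective, here is a rough transfer-time calculation in Python. It uses the nominal rates quoted above and ignores protocol overhead and line degradation, so real times would be longer; the technology labels are just the examples from the text.

```python
# Rough download-time comparison for the access technologies discussed above.
# Rates are the nominal figures quoted in the text; real throughput is lower.

RATES_KBPS = {
    "multi-access radio": 9.6,
    "rural dial-up": 33.0,
    "V.90 modem": 56.0,
    "ADSL (JetStream)": 6000.0,   # "up to" 6 Mbps downstream
}

def download_seconds(size_mb: float, rate_kbps: float) -> float:
    """Seconds to move size_mb megabytes at rate_kbps kilobits per second."""
    bits = size_mb * 8 * 1000 * 1000    # using decimal megabytes
    return bits / (rate_kbps * 1000)

for name, rate in RATES_KBPS.items():
    print(f"{name:20s}: {download_seconds(10, rate):8.0f} s for a 10 MB file")
```

The gap is stark: the same 10 MB file that an ADSL subscriber fetches in seconds ties up a 9.6 Kbps rural radio link for well over two hours.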
ISDN (Integrated Services Digital Network): Telecom provides ISDN in all the main centres as well as many of the smaller centres; however, with one or two exceptions, it is not generally available in rural areas. The technology supports data transfer rates from 64 Kbps to 2 Mbps. Basic Rate ISDN installations provide the equivalent of two standard telephone lines: one can be used for voice and the other for data, or both lines can be used together to achieve data rates of 128 Kbps. This is just adequate for two-way video applications such as distance learning and video-conferencing. Multiple ISDN lines can be used to obtain higher-quality connections; for example, three ISDN lines provide a connection speed of 384 Kbps, which is typically used where video quality is critical, such as in tele-medicine applications. Telecom NZ has demonstrated a reluctance to make further investment in its ISDN infrastructure, promoting ADSL services as a preferred alternative.

Frame Relay: This is available from 64 Kbps and is easily scalable up to 2 Mbps. Frame relay is generally used as a dedicated point-to-point service or as a virtual private network, and it has the advantage of fixed-price tariffs (no usage charges). The technology is especially suited to applications where guaranteed bandwidth is required, such as voice and video.

Asynchronous Transfer Mode (ATM): This technology offers very high bandwidth, up to 150 Mbps, and is typically used for linking corporate networks.

IP Networking: This is a new family of services being piloted by Telecom NZ which is especially suitable for Internet connections where dedicated bandwidth is not critical. The focus is on flexibility and interconnectivity between a variety of connection services, such as dial-up telephone, ISDN and frame relay.
Rural issues with the local loop

While in some urban areas high-bandwidth and even broadband access speeds are available, telephone subscribers more than 3 to 5 kilometres from an exchange are limited to access rates of 33 Kbps or less. In addition, there are major problems with line quality reported for rural subscribers. A recent survey conducted for MAF reported 54% of rural subscribers as having problems affecting telephone lines, including noise, electric fences (often a problem of poor fence installation by the farmer) and exchange overload. Telecom NZ reports that only 5% of local loop lines are not capable of maintaining a reliable data speed of 14.4 Kbps. The overwhelming majority of these would be in rural areas, and thus this figure represents a large proportion of rural subscribers. An indicator of insufficient infrastructure capacity is the number of reported problems with obtaining a second or third telephone line (an obvious way of trying to bypass the data-rate bottleneck). Over one third of survey respondents who indicated that they had attempted to get a second line failed to do so.

References:
• http://en.wikipedia.org/wiki/Telecommunications
• http://images.google.com.ph/imgres?imgurl=http://www.window.state.tx.us/tspr/alamoccd/ex8-19.gif&imgrefurl=http://www.window.state.tx.us/tspr/alamoccd/ch08c.htm&usg=__QFFE6JleZKYmdBSFMxH9KzOBsc0=&h=873&w=762&sz=24&hl=tl&start=13&um=1&tbnid=h6u3xK_4vRUjgM:&tbnh=146&tbnw=127&prev=/images%3Fq%3Dinternet%2Bconnectivity%2Binfrastructure%26hl%3Dtl%26um%3D1
• http://searchdatacenter.techtarget.com/sDefinition/0,,sid80_gci212346,00.html
• http://images.google.com.ph/imgres?imgurl=http://executive.govt.nz/minister/maharey/divide/images/fig-1.gif&imgrefurl=http://executive.govt.nz/minister/maharey/divide/03-01.htm&usg=__IQ9PipLjXwHzHU2MdP8YQefs6iI=&h=328&w=460&sz=21&hl=tl&start=3&um=1&tbnid=ozNXpdX5s5uT9M:&tbnh=91&tbnw=128&prev=/images%3Fq%3Dinternet%2Bconnectivity%2Binfrastructure%26hl%3Dtl%26um%3D1
Please visit and leave some comments on my blog, The Dreamer and Believer | |
| | | desiree
Posts : 26 Points : 28 Join date : 2009-06-22 Age : 32 Location : Davao City
| Subject: Re: Assignment 6 (Due: before August 19, 2009, 13:00hrs) Wed Aug 26, 2009 2:34 pm | |
|
Configuration Management
Configuration Management is a process that tracks all individual Configuration Items (CI) in a system.
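To make "tracking Configuration Items" concrete, here is a minimal Python sketch of a configuration database. The class names, CI identifiers and fields are invented for illustration; real CMDB tools record far more attributes and relationship types.

```python
# Minimal sketch of a Configuration Management database (CMDB): each
# Configuration Item (CI) is recorded with its attributes and its
# dependencies, so the impact of changing one item can be traced.

class ConfigurationItem:
    def __init__(self, ci_id, ci_type, owner):
        self.ci_id = ci_id
        self.ci_type = ci_type
        self.owner = owner
        self.depends_on = []        # other CIs this item relies on

class CMDB:
    def __init__(self):
        self._items = {}

    def register(self, ci):
        self._items[ci.ci_id] = ci

    def link(self, ci_id, depends_on_id):
        self._items[ci_id].depends_on.append(self._items[depends_on_id])

    def impact_of(self, ci_id):
        """Return IDs of CIs that directly depend on the given CI."""
        return [c.ci_id for c in self._items.values()
                if any(d.ci_id == ci_id for d in c.depends_on)]

cmdb = CMDB()
cmdb.register(ConfigurationItem("router-01", "network", "IT Ops"))
cmdb.register(ConfigurationItem("proxy-01", "server", "IT Ops"))
cmdb.link("proxy-01", "router-01")
print(cmdb.impact_of("router-01"))   # proxy-01 is affected if the router changes
```

The point of the exercise is the `impact_of` query: once every CI and dependency is registered, Change Management can ask "what breaks if this item changes?" instead of guessing.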
Service Delivery

The Service Delivery discipline is primarily concerned with the proactive services that ICT must deliver to provide adequate support to business users. It focuses on the business as the customer of the ICT services (compare with: Service Support). The discipline consists of the following processes, explained in the subsections below:

* Service Level Management
* Capacity Management
* IT Service Continuity Management
* Availability Management
* Financial Management
Service Level Management

Service Level Management provides for continual identification, monitoring and review of the levels of IT services specified in the service level agreements (SLAs). It ensures that arrangements are in place with internal IT support providers and external suppliers, in the form of Operational Level Agreements (OLAs) and Underpinning Contracts (UCs). The process involves assessing the impact of change upon service quality and SLAs. The service level management process works in close relation with the operational processes to control their activities. The central role of Service Level Management makes it the natural place for metrics to be established and monitored against a benchmark. Service Level Management is the primary interface with the customer (as opposed to the user, who is serviced by the Service Desk). Service Level Management is responsible for:

* ensuring that the agreed IT services are delivered when and where they are supposed to be
* liaising with Availability Management, Capacity Management, Incident Management and Problem Management to ensure that the required levels and quality of service are achieved within the resources agreed with Financial Management
* producing and maintaining a Service Catalog (a list of standard IT service options and agreements made available to customers)
* ensuring that appropriate IT Service Continuity plans have been made to support the business and its continuity requirements.

The Service Level Manager relies on the other areas of the Service Delivery process to provide the necessary support which ensures the agreed services are provided in a cost-effective, secure and efficient manner.
Capacity Management Capacity Management supports the optimum and cost effective provision of IT services by helping organizations match their IT resources to the business demands. The high-level activities are Application Sizing, Workload Management, Demand Management, Modeling, Capacity Planning, Resource Management, and Performance Management.
IT Service Continuity Management IT Service Continuity Management is the process by which plans are put in place and managed to ensure that IT Services can recover and continue should a serious incident occur. It is not just about reactive measures, but also about proactive measures - reducing the risk of a disaster in the first instance. Continuity management is regarded as the recovery of the IT infrastructure used to deliver IT Services, but many businesses these days practice the much further reaching process of Business Continuity Planning (BCP), to ensure that the whole end-to-end business process can continue should a serious incident occur. Continuity management involves the following basic steps:
* Prioritizing the businesses to be recovered by conducting a Business Impact Analysis (BIA) * Performing a Risk Assessment (aka Risk Analysis) for each of the IT Services to identify the assets, threats, vulnerabilities and countermeasures for each service. * Evaluating the options for recovery * Producing the Contingency Plan * Testing, reviewing, and revising the plan on a regular basis
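The prioritization step above (BIA plus risk assessment) can be sketched as a simple scoring exercise in Python. The services, impact and likelihood scores below are invented for illustration; a real BIA would use the organization's own scales and criteria.

```python
# Sketch of the Business Impact Analysis / Risk Assessment step: score each
# IT service by impact and likelihood, then recover the highest-risk first.

services = [
    {"name": "e-mail",         "impact": 3, "likelihood": 2},
    {"name": "student portal", "impact": 5, "likelihood": 3},
    {"name": "print server",   "impact": 1, "likelihood": 4},
]

for s in services:
    s["risk"] = s["impact"] * s["likelihood"]   # simple risk = impact x likelihood

recovery_order = sorted(services, key=lambda s: s["risk"], reverse=True)
print([s["name"] for s in recovery_order])
```

The output of this step feeds directly into the Contingency Plan: the services at the top of `recovery_order` get recovery resources (and testing attention) first.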
Availability Management Availability Management allows organizations to sustain the IT service availability to support the business at a justifiable cost. The high-level activities are Realize Availability Requirements, Compile Availability Plan, Monitor Availability, and Monitor Maintenance Obligations.
Availability Management concerns the ability of an IT component to perform at an agreed level over a period of time.

* Reliability: how reliable is the service? The ability of an IT component to perform at an agreed level under described conditions.
* Maintainability: the ability of an IT component to remain in, or be restored to, an operational state.
* Serviceability: the ability of an external supplier to maintain the availability of a component or function under a third-party contract.
* Resilience: a measure of freedom from operational failure and a method of keeping services reliable. One popular method of resilience is redundancy.
* Security: a service may have associated data. Security refers to the confidentiality, integrity, and availability of that data.

Availability gives us a clear overview of the end-to-end availability of the system.
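The availability figures these activities monitor are often computed from mean time between failures (MTBF) and mean time to repair (MTTR). A small Python sketch (the MTBF/MTTR figures are example values, not from the original post):

```python
# Availability expressed as the fraction of agreed service time a component
# is actually usable, derived from MTBF and MTTR figures.

def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def downtime_per_year(avail: float) -> float:
    """Expected hours of downtime in a year at a given availability."""
    return (1 - avail) * 365 * 24

a = availability(999, 1)            # fails every ~1000 h, 1 h to repair
print(f"{a:.3%} available, ~{downtime_per_year(a):.1f} h downtime/year")
```

This is why SLA targets are usually quoted in "nines": moving from 99.9% to 99.99% availability cuts expected annual downtime by a factor of ten.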
Financial Management for IT Services

IT Financial Management is the discipline of ensuring that the IT infrastructure is obtained at the most effective price (which does not necessarily mean the cheapest) and of calculating the cost of providing IT services, so that an organization can understand the costs of its IT services. These costs may then be recovered from the customers of the service.
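Cost recovery as described can be sketched as a simple usage-based apportionment. The department names, usage figures and monthly cost below are invented for illustration; real chargeback models also distinguish fixed from variable costs.

```python
# Sketch of IT Financial Management cost recovery: apportion the cost of a
# shared service to its customers in proportion to their usage.

total_monthly_cost = 12000.0                 # cost of providing the service
usage = {"Registrar": 400, "Library": 250, "Engineering": 350}   # e.g. GB used

total_usage = sum(usage.values())
charges = {dept: total_monthly_cost * u / total_usage
           for dept, u in usage.items()}
print(charges)
```

Note that the charges sum back to the full cost of providing the service, which is the defining property of a cost-recovery (as opposed to profit-making) model.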
Planning to implement service management The ITIL discipline - Planning To Implement Service Management attempts to provide practitioners with a framework for the alignment of business needs and IT provision requirements. The processes and approaches incorporated within the guidelines suggest the development of a Continuous Service Improvement Programme (CSIP) as the basis for implementing other ITIL disciplines as projects within a controlled programme of work. Planning To Implement Service Management is mainly focused on the Service Management processes, but is also generically applicable to other ITIL disciplines.
* create vision
* analyze organization
* set goals
* implement IT service management
Security Management

The ITIL Security Management process [7] describes how information security is structurally embedded in the management organization. ITIL Security Management is based on the code of practice for information security management, also known as ISO/IEC 17799.
A basic concept of Security Management is information security. The primary goal of information security is to guarantee the safety of information: safety here means being protected against risks, and security is the means of achieving that safety. When protecting information, it is the value of the information that has to be protected. That value is determined by confidentiality, integrity and availability; inferred aspects are privacy, anonymity and verifiability. The current move towards ISO/IEC 27001 may require some revision of the ITIL Security Management best practices, which are often claimed to be rich in content for physical security but weak in areas such as software/application security and logical security in the ICT infrastructure.
| |
| | | desiree
Posts : 26 Points : 28 Join date : 2009-06-22 Age : 32 Location : Davao City
| Subject: Re: Assignment 6 (Due: before August 19, 2009, 13:00hrs) Wed Aug 26, 2009 2:39 pm | |
| ICT Infrastructure Management

ICT Infrastructure Management processes recommend best practice for requirements analysis, planning, design, deployment and ongoing operations management and technical support of an ICT infrastructure. ("ICT" is an acronym for "Information and Communication Technology".) The Infrastructure Management processes describe those processes within ITIL that directly relate to the ICT equipment and software involved in providing ICT services to customers:

* ICT Design and Planning
* ICT Deployment
* ICT Operations
* ICT Technical Support

These disciplines are less well understood than those of Service Management, and therefore some of their content is often believed to be covered 'by implication' in the Service Management disciplines.

ICT Design and Planning

ICT Design and Planning provides a framework and approach for the strategic and technical design and planning of ICT infrastructures. It includes the necessary combination of business (and overall IS) strategy with technical design and architecture. ICT Design and Planning drives the procurement of new ICT solutions through the production of Statements of Requirement ("SOR") and Invitations to Tender ("ITT"), and is responsible for the initiation and management of ICT programmes for strategic business change. Key outputs from Design and Planning are:

* ICT strategies, policies and plans
* The ICT overall architecture and management architecture
* Feasibility studies, ITTs and SORs
* Business cases

ICT Deployment Management

ICT Deployment provides a framework for the successful management of design, build, test and roll-out (deploy) projects within an overall ICT programme. It includes many project management disciplines in common with PRINCE2, but has a broader focus that includes the necessary integration of Release Management and both functional and non-functional testing.

ICT Operations Management

ICT Operations Management provides the day-to-day technical supervision of the ICT infrastructure.
Often confused with the role of Incident Management from Service Support, Operations is more technical and is concerned not solely with incidents reported by users, but with events generated by or recorded by the infrastructure. ICT Operations may often work closely alongside Incident Management and the Service Desk, which are not necessarily technical, to provide an 'Operations Bridge'. Operations, however, should primarily work from documented processes and procedures, and should be concerned with a number of specific sub-processes, such as: output management, job scheduling, backup and restore, network monitoring/management, system monitoring/management, database monitoring/management and storage monitoring/management. Operations are responsible for:
* A stable, secure ICT infrastructure
* A current, up-to-date Operational Documentation Library ("ODL")
* A log of all operational events
* Maintenance of operational monitoring and management tools
* Operational scripts
* Operational procedures

ICT Technical Support
ICT Technical Support is the specialist technical function for infrastructure within ICT. Primarily as a support to other processes, both in Infrastructure Management and Service Management, Technical Support provides a number of specialist functions: Research and Evaluation, Market Intelligence (particularly for Design and Planning and Capacity Management), Proof of Concept and Pilot engineering, specialist technical expertise (particularly to Operations and Problem Management), creation of documentation (perhaps for the Operational Documentation Library or Known Error Database).
The Business Perspective

The Business Perspective is the name given to the collection of best practices [9] suggested to address some of the issues often encountered in understanding and improving IT service provision, as part of the entire business requirement for high-quality IS management. These issues are:
* Business Continuity Management describes the responsibilities and opportunities available to the business manager to improve what is, in most organizations, one of the key contributing services to business efficiency and effectiveness.
* Surviving change: IT infrastructure changes can impact the manner in which business is conducted or the continuity of business operations. It is important that business managers take notice of these changes and ensure that steps are taken to safeguard the business from adverse side effects.
* Transformation of business practice through radical change helps to control IT and to integrate it with the business.
* Partnerships and outsourcing

Application Management

The ITIL Application Management set encompasses a set of best practices proposed to improve the overall quality of IT software development and support through the life-cycle of software development projects, with particular attention to gathering and defining requirements that meet business objectives.
Software Asset Management Software Asset Management (SAM) is the practice of integrating people, processes and technology to allow software licenses and usage to be systematically tracked, evaluated and managed. The goal of SAM is to reduce IT expenditures, human resource overhead and risks inherent in owning and managing software assets.
SAM practices include:

* Maintaining software license compliance
* Tracking inventory and software asset use
* Maintaining standard policies and procedures surrounding definition, deployment, configuration, use, and retirement of software assets and the Definitive Software Library.

SAM represents the software component of IT asset management. This includes hardware asset management, because effective hardware inventory controls are critical to efforts to control software. This means overseeing the software and hardware that comprise an organization's computers and network.
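The license-compliance practice described above boils down to comparing installed copies against entitlements. A minimal Python sketch; the product names and counts are invented for the example.

```python
# Sketch of a software license compliance check, a core SAM activity:
# compare the copies found installed against the licenses actually held.

licenses_owned = {"office-suite": 100, "cad-tool": 10, "av-client": 500}
installs_found = {"office-suite": 120, "cad-tool": 8,  "av-client": 480}

def compliance_gaps(owned, installed):
    """Return titles installed in excess of the licenses held."""
    return {title: installed[title] - owned.get(title, 0)
            for title in installed
            if installed[title] > owned.get(title, 0)}

print(compliance_gaps(licenses_owned, installs_found))   # over-deployed titles
```

In practice the `installs_found` side comes from automated inventory scans, which is why SAM depends on the hardware asset controls mentioned above.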
Small-Scale Implementation ITIL Small-Scale Implementation provides an approach to ITIL framework implementation for smaller IT units or departments. It is primarily an auxiliary work that covers many of the same best practice guidelines as Planning To Implement Service Management, Service Support, and Service Delivery but provides additional guidance on the combination of roles and responsibilities, and avoiding conflict between ITIL priorities.
Some of the information above is supported by this link: http://en.wikipedia.org/wiki/Information_Technology_Infrastructure_Library#Overview_of_the_ITIL_v3_library
| |
| | | Alfredo V. Ala-an
Posts : 52 Points : 55 Join date : 2009-06-21 Age : 33 Location : Obrero Davao City
| Subject: ASSIGNMENT 6 Fri Aug 28, 2009 5:14 am | |
| If you were hired by the university president as an IT consultant, what would you suggest (technology, infrastructure, innovations, steps, processes, etc) in order for the internet connectivity be improved? (3000words)
Identifying first the terms:
Technology
Technology is a broad concept that deals with human as well as other animal species' usage and knowledge of tools and crafts, and how it affects a species' ability to control and adapt to its environment. Technology is a term with origins in the Greek technología (τεχνολογία) — téchnē (τέχνη), 'craft' and logía (λογία), 'saying'.[1] However, a strict definition is elusive; "technology" can refer to material objects of use to humanity, such as machines, hardware or utensils, but can also encompass broader themes, including systems, methods of organization, and techniques. The term can either be applied generally or to specific areas: examples include "construction technology", "medical technology", or "state-of-the-art technology".
The human species' use of technology began with the conversion of natural resources into simple tools. The prehistorical discovery of the ability to control fire increased the available sources of food and the invention of the wheel helped humans in travelling in and controlling their environment. Recent technological developments, including the printing press, the telephone, and the Internet, have lessened physical barriers to communication and allowed humans to interact freely on a global scale. However, not all technology has been used for peaceful purposes; the development of weapons of ever-increasing destructive power has progressed throughout history, from clubs to nuclear weapons.
Technology has affected society and its surroundings in a number of ways. In many societies, technology has helped develop more advanced economies (including today's global economy) and has allowed the rise of a leisure class. Many technological processes produce unwanted by-products, known as pollution, and deplete natural resources, to the detriment of the Earth and its environment. Various implementations of technology influence the values of a society and new technology often raises new ethical questions. Examples include the rise of the notion of efficiency in terms of human productivity, a term originally applied only to machines, and the challenge of traditional norms.
Philosophical debates have arisen over the present and future use of technology in society, with disagreements over whether technology improves the human condition or worsens it. Neo-Luddism, anarcho-primitivism, and similar movements criticise the pervasiveness of technology in the modern world, opining that it harms the environment and alienates people; proponents of ideologies such as transhumanism and techno-progressivism view continued technological progress as beneficial to society and the human condition. Indeed, until recently, it was believed that the development of technology was restricted only to human beings, but recent scientific studies indicate that other primates and certain dolphin communities have developed simple tools and learned to pass their knowledge to other generations.
http://en.wikipedia.org/wiki/Technology
Infrastructure
Infrastructure can be defined as the basic physical and organizational structures needed for the operation of a society or enterprise,[1] or the services and facilities necessary for an economy to function.[2] The term typically refers to the technical structures that support a society, such as roads, water supply, sewers, power grids, telecommunications, and so forth. Viewed functionally, infrastructure facilitates the production of goods and services; for example, roads enable the transport of raw materials to a factory, and also the distribution of finished products to markets. In some contexts, the term may also include basic social services such as schools and hospitals.[3] In military parlance, the term refers to the buildings and permanent installations necessary for the support, redeployment, and operation of military forces.[4]
In this article, infrastructure will be used in the sense of technical structures or physical networks that support society, unless specified otherwise.
http://en.wikipedia.org/wiki/Infrastructure
Innovations
The term innovation refers to a new way of doing something. It may refer to incremental and emergent or radical and revolutionary changes in thinking, products, processes, or organizations. Following Schumpeter (1934), contributors to the scholarly literature on innovation typically distinguish between invention, an idea made manifest, and innovation, ideas applied successfully in practice. In many fields, something new must be substantially different to be innovative, not an insignificant change, e.g., in the arts, economics, business and government policy. In economics the change must increase value, customer value, or producer value. The goal of innovation is positive change, to make someone or something better. Innovation leading to increased productivity is the fundamental source of increasing wealth in an economy.
Innovation is an important topic in the study of economics, business, design, technology, sociology, and engineering. Colloquially, the word "innovation" is often synonymous with the output of the process. However, economists tend to focus on the process itself, from the origination of an idea to its transformation into something useful, to its implementation; and on the system within which the process of innovation unfolds. Since innovation is also considered a major driver of the economy, especially when it leads to increasing productivity, the factors that lead to innovation are also considered to be critical to policy makers. In particular, followers of innovation economics stress using public policy to spur innovation and growth.
Those who are directly responsible for application of the innovation are often called pioneers in their field, whether they are individuals or organizations.
http://en.wikipedia.org/wiki/Innovation
Steps/Processes
In this context, the "steps" of a process are simply the individual actions carried out in sequence to reach a result; the assignment question uses the word in that everyday sense, as the concrete actions that make up a process, not as a technical term.
Process(in computing)
In computing, a process is an instance of a computer program, consisting of one or more threads, that is being sequentially executed[1] by a computer system that has the ability to run several computer programs concurrently.
A computer program itself is just a passive collection of instructions, while a process is the actual execution of those instructions. Several processes may be associated with the same program; for example, opening up several instances of the same program often means more than one process is being executed. In the computing world, processes are formally defined by the operating system (OS) running them and so may differ in detail from one OS to another.
A single computer processor executes one or more instructions at a time (per clock cycle), one after the other (this is a simplification; for the full story, see superscalar CPU architecture). To allow users to run several programs at once (e.g., so that processor time is not wasted waiting for input from a resource), single-processor computer systems can perform time-sharing. Time-sharing allows processes to switch between being executed and waiting (to continue) to be executed. In most cases this is done very rapidly, providing the illusion that several processes are executing 'at once'; this is known as concurrency or multiprogramming. Using more than one physical processor on a computer permits true simultaneous execution of more than one stream of instructions from different processes, but time-sharing is still typically used to allow more than one process to run at a time. (Concurrency is the term generally used for several independent processes sharing a single processor; simultaneous execution refers to several processes, each with their own processor.) Different processes may share the same set of instructions in memory (to save storage), but this is not known to any one process. Each execution of the same set of instructions is known as an instance: a completely separate instantiation of the program.
For security and reliability reasons most modern operating systems prevent direct communication between 'independent' processes, providing strictly mediated and controlled inter-process communication functionality.
http://en.wikipedia.org/wiki/Process_(computing)
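The program-versus-process distinction described above can be demonstrated with a short Python sketch (not from the original post): launching the same trivial program twice produces two separate processes, each with its own process ID.

```python
# A program is a passive collection of instructions; a process is a running
# instance of it. Launching the same one-line program twice gives two
# processes, each with a distinct process ID.

import subprocess
import sys

code = "import os; print(os.getpid())"
pid1 = subprocess.run([sys.executable, "-c", code],
                      capture_output=True, text=True).stdout.strip()
pid2 = subprocess.run([sys.executable, "-c", code],
                      capture_output=True, text=True).stdout.strip()

print(pid1, pid2)        # same program, two different processes
assert pid1 != pid2
```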
Idea:
If I were hired as an IT consultant for our school, the first thing I would address to improve the Internet connection is the infrastructure. Proper infrastructure is the foundation on which everything else rests: you cannot make good use of technology without it. A sound, well-maintained environment also gives students a better setting in which to study, think clearly, and improve their grades. So I suggest the infrastructure be put in place first.
The second thing I would suggest is the technology itself. The first step established the proper infrastructure precisely so that the technology can be deployed on top of it; in my mind the two go together. Technology leads to knowledge, and knowledge leads to power. Access to up-to-date technology gives students ideas about the future technology they themselves might produce, and that is where real learning begins. Knowledge is a great gift, and the people at the top of their fields are usually the ones who have it.
The third thing I would suggest is innovation, in the sense of continually adopting new technology. As we all know, technology changes quickly, so innovation has to be built into these steps and processes rather than treated as a one-off upgrade.
Last edited by Alfredo V. Ala-an on Thu Oct 01, 2009 1:38 am; edited 1 time in total | |
| | | Sheila Capacillo
Posts : 57 Points : 57 Join date : 2009-06-21 Age : 33 Location : Davao City
| Subject: assignment 6 Fri Aug 28, 2009 11:55 am | |
| If you were hired by the university president as an IT consultant, what would you suggest (technology, infrastructure, innovations, steps, processes, etc) in order for the internet connectivity be improved? (3000words)
If I were hired as an IT consultant for the university, I would propose the following. The very first step to improve a wireless signal is to position the router in a central location. This ensures that, regardless of where you are in the building, you will be within a range that allows fast connectivity.
If you take a look at your router, you will likely find a small antenna attached to the unit. This antenna projects the wireless signal in an omni-directional manner. To improve your wireless Internet connection, consider attaching a larger antenna to the router. If your computer sits in one fixed location, avoid the omni-directional antennas and buy a "high-gain" directional antenna instead. These antennas point the connectivity signal in the direction of your choice and drastically improve its overall strength. As a result, when you are online, browsing is quicker and uploading and downloading are faster.
There may be many individuals who want a strong wireless Internet connection in a home, business or school. If so, it may be worth investing in special devices called "repeaters", which relay the existing wireless signal to other parts of the structure. Where a large number of people are online within one building, deploying several repeaters is appropriate; this ensures every user experiences good speeds for surfing, viewing media online, downloading and uploading. Wireless technology makes it easy to get rid of the cables and take computing away from the desk. It is becoming the household norm, and while the technology is advancing quickly, there are some easy things you can do to improve your own wireless connectivity.
5 steps to improve connectivity:
-Position Your Router: A wireless signal doesn't carry far, and any walls or large objects may cause interference. For this reason, a wireless router should be centrally located in your home to ensure the best range possible. Place the router on a flat surface off the floor and away from obstructions. Additionally, there could be interference from a neighboring wireless signal, so make sure you're using a unique wireless channel to limit interference.
-Replace Your Antenna: The antennas shipped with most routers are small antennas with omni-directional capabilities. These antennas broadcast a signal in all directions, which can be useful if you need wireless throughout your house, but the range is quite short. A directional antenna can improve range by focusing the signal in a specific way, allowing you to aim it where it's needed. These antennas are often called "high-gain", and the signal increase is measured in decibels (dB).
-Get a Repeater: A wireless repeater is the easy and safe way to boost your signal. A repeater works very much like a router, but instead of creating a signal, it relays an existing one. A repeater is easy to install and doesn't require any additional wires or connections. Multiple repeaters make it easy to create a home or business network with complete connectivity.
-Get an Antenna Booster: It's possible to make a homemade reflector or antenna to improve your wireless signal. There are templates and building instructions on many web sites that use materials as commonplace as foil and cardboard. Common designs are a parabolic satellite shape and a "coffee can" Yagi antenna. Both can increase range and direct your signal, though homemade quality will vary.
-Upgrade Firmware: Router manufacturers publish firmware updates regularly, and upgrading your router can provide a performance boost and access to new features. Another option for the tech-savvy is to install third-party firmware. There are a number of free, safe alternatives that may be compatible with your router; one project, DD-WRT, offers more robust features than many of the official firmware packages. Using these tips, you should be able to squeeze every bit of connectivity out of your own home network. Check out HowStuffWorks' other articles on home networks to learn more.
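On the "unique wireless channel" point above: 2.4 GHz channel centers sit only 5 MHz apart, while an 802.11b/g signal is roughly 20-22 MHz wide, which is why the usual advice is to pick channel 1, 6 or 11. A small sketch of that arithmetic (the 22 MHz width is the classic 802.11b figure; treat it as an approximation):

```python
def channel_freq_mhz(channel: int) -> int:
    """Center frequency of 2.4 GHz Wi-Fi channels 1-13 (channel 14 is a special case)."""
    return 2407 + 5 * channel

def channels_overlap(a: int, b: int, width_mhz: int = 22) -> bool:
    """Two channels interfere when their centers are closer than one signal width."""
    return abs(channel_freq_mhz(a) - channel_freq_mhz(b)) < width_mhz

# The classic non-overlapping trio is 1, 6 and 11 (25 MHz between centers):
assert not channels_overlap(1, 6) and not channels_overlap(6, 11)
# Neighboring channels, by contrast, step on each other:
assert channels_overlap(1, 3)
```

So if a neighbor's router is on channel 6, moving yours to 1 or 11 avoids the overlap; moving it to 5 or 7 does not.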
http://computer.howstuffworks.com/improving-wireless-connection.htm
wanna visit my blog?
http://shecapacillo.blogspot.com
Last edited by Sheila Capacillo on Thu Oct 01, 2009 4:52 am; edited 1 time in total | |
| | | kristine_delatorre
Posts : 58 Points : 60 Join date : 2009-06-21 Age : 33 Location : davao city
| Subject: Re: Assignment 6 (Due: before August 19, 2009, 13:00hrs) Fri Aug 28, 2009 12:36 pm | |
| <<<<<<<<<<<UNDER CONSTRUCTION>>>>>>>>> | |
| | | AlyssaRae Soriano
Posts : 38 Points : 39 Join date : 2009-06-20 Age : 34
| Subject: Re: Assignment 6 (Due: before August 19, 2009, 13:00hrs) Sat Aug 29, 2009 2:47 pm | |
| If I were hired as an IT consultant at the university, before giving suggestions to improve the internet connection I would first state some possible reasons why the internet connection fails. (This is according to Bob Rankin.)
“There are several possible reasons why your Internet connection might suddenly stop working. If you were on dialup, the most likely scenario would be noise on the phone line, but since you have a high-speed cable connection, we can rule that out.
In a recent Ask Bob Rankin article Do Computers Get Tired? I addressed the subject of electronic devices that fail at random times, and gave some scientific basis for occasionally turning the device off and then back on. I have a cable modem as well, and have found that sometimes a slow or dropped Internet connection is restored simply by unplugging the cable modem, waiting 30 seconds and powering it back on. Not pretty, but it works. If the problem is happening every day, though, it might be better to replace the modem.
There could also be a software-related issue which is causing your Internet connection to fail. If you have other computers on a home network, and they have no trouble getting online, then I'd cast a wary glance in the direction of your firewall. Firewalls are designed to block certain Internet connections, so it's entirely possible that a bug in the firewall software is erroneously shutting down ALL network connections. You may even have told the firewall to do this without meaning to.
Open your firewall's configuration screen and check to see what programs are being blocked from connecting to the Internet. If nothing obvious appears to be erroneously blocked, try shutting down or uninstalling the firewall software, then reboot and see if the problem persists. If that fixes the problem, consider ditching the software-based firewall, especially if you have a router between your computer and the cable modem. Routers have hardware-based firewalls built in, which makes firewall software superfluous for most users.
To uninstall the firewall software, click on the Start button, open Control Panel, then Add/Remove Programs. Find the firewall in the list and click the Remove button. Note that your firewall may be bundled with an anti-virus or internet security package. If that's the case, click on that package (e.g. eTrust EZ-Armor or Norton Internet Security), and make sure you select ONLY the firewall for removal, leaving the anti-virus protection in place.
If none of those things helps, report the problem to your Internet provider. If the problem has something to do with your modem or the cabling in your neighborhood, it might be affecting your neighbors as well.” – Bob Rankin
After stating some possible reasons why the internet connectivity fails, here are some of the steps, tips, and some warnings in maximizing your internet connection. (Internet based.)
STEPS: 1. Do some basic maintenance on your PC. Run Disk Defrag, a scan disk, a virus scan, a malware scan, and clear your recycle bin. An unusually slow Internet connection is often the only sign that your computer is infected with viruses or other malware. Delete old files and temporary files. Never allow the free space on your C: drive to be less than 10% of the total size or twice the installed RAM (whichever is larger). A well-maintained PC will operate much better than a PC that has never had any maintenance. Google or your local computer repair store should be able to help you with this if you don't know how.
2. Reset your home network. If you have one, restarting it will sometimes drastically increase the speed of your connection.
3. Optimize your cache or temporary Internet files. These files improve your Internet connection performance by not downloading the same file over and over. When a web site puts its logo graphic on every page, your computer only downloads it when it changes. If you delete the temporary files, they must be downloaded again; if you disable the cache, they must be downloaded every time you view a page that uses them. To adjust this, open Internet Explorer, click "Tools" at the top and choose "Internet Options". On the General tab, click the "Settings" button next to Temporary Internet Files. Set "Check for newer versions" to "Automatically". Set the amount of disk space to use to 2% of your total disk size or 512 MB, whichever is smaller. In Firefox, click "Tools", then "Options", go to the Privacy tab, and then click the Cache tab within it.
4. Never bypass your router. Most routers include a firewall that is very difficult for hackers to defeat. If you don't need wireless, hook your computer directly to your router. A router will only slow down your connection by a few milliseconds; you won't notice the difference, but the hackers will.
5. If you are using a wireless router, make sure it doesn't conflict with a cordless phone or wireless camera. Wireless routers come in two common varieties: 802.11b/g (2.4 GHz) and 802.11a (5 GHz). If you are using a 2.4 GHz cordless phone and a 2.4 GHz wireless router, your Internet connection speed will slow while you use the cordless phone. The same is true of wireless security cameras. Check your phone and camera: if it's 900 MHz, it's fine. If it says 2.4 GHz or 5.8 GHz, it could be the cause of your slow connection speed while they're in use.
6. Call your Internet service provider (ISP). Sometimes you just have bad service. They can usually tell if your connection is substandard without having a technician come to your home. Just be nice and ask.
7. Upgrade your computer. If your computer is slow, it doesn't matter how fast your Internet connection is, the whole thing will just seem slow. You can only access the Internet as fast as your PC will allow you to.
8. Replace your old cable modem. Any solid-state electronics will degrade over time due to accumulated heat damage. Your broadband modem will have a harder and harder time 'concentrating' on maintaining a good connection as it gets older (signal to noise ratios will go down, and the number of resend requests for the same packet will go up). An after-market cable modem as opposed to a cable-company modem will frequently offer a better connection.
9. Often your connection speed is slow because other programs are using it. To test whether other programs are accessing the Internet without your knowledge, click Start, then Run, and type "cmd" (without quotes). In the command window, type "netstat -b 5 > activity.txt" (the -b switch may require administrator rights). After a minute or so, hold down Ctrl and press C. This creates a file listing all programs using your Internet connection. Type "activity.txt" to open the file and view the program list. Then press Ctrl+Alt+Delete, open the Task Manager, go to the Processes tab, and end the processes that are stealing your valuable bandwidth. (NOTE: Ending processes may cause certain programs to not function properly.)
10. After you have tried all of this, test your connection again and see if it's running any faster.
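The free-space rule in step 1 (keep at least 10% of the drive, or twice the installed RAM, free, whichever is larger) is easy to misjudge mentally. Here is a minimal sketch of the arithmetic in Python; the drive and RAM sizes are made-up examples, not measurements.

```python
GB = 1024**3

def cleanup_threshold(total_bytes: int, ram_bytes: int) -> int:
    """Minimum free space per step 1: 10% of the drive or twice the
    installed RAM, whichever is larger (integer maths, so it stays exact)."""
    return max(total_bytes // 10, 2 * ram_bytes)

def needs_cleanup(total_bytes: int, free_bytes: int, ram_bytes: int) -> bool:
    """True when free space has dropped below the step-1 threshold."""
    return free_bytes < cleanup_threshold(total_bytes, ram_bytes)

# Hypothetical 500 GB drive with 4 GB RAM: the 10% rule (50 GB) dominates.
assert cleanup_threshold(500 * GB, 4 * GB) == 50 * GB
# Hypothetical 100 GB drive with 8 GB RAM: the twice-RAM rule (16 GB) dominates.
assert cleanup_threshold(100 * GB, 8 * GB) == 16 * GB
```

On a live system, the standard-library call `shutil.disk_usage("C:\\")` (or `"/"`) supplies real `total` and `free` values to feed into `needs_cleanup`.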
TIPS: 1. Call your ISP and have them verify all of your TCP/IP settings if you are concerned. Ask them to verify that your Proxy settings are correct.
2. Don't expect dial up or high speed lite service to be fast. The Internet is primarily geared towards Broadband Connections. Sometimes, you have to wait a little.
3. Download programs that make browsing faster:
- Loband.org is a browser inside of a browser that loads web pages without the images.
- Firefox and Opera both have options to disable images.
- In Firefox, you can also use extensions such as NoScript that let you block scripts and plug-ins that would otherwise slow things down a lot.
- If you are using Internet Explorer or Firefox, try downloading Google Web Accelerator. It is meant to speed up broadband connections, but it can also slow your Internet connection. Try enabling and disabling it and see when your Internet connection runs faster.
- If you are using Firefox, download the Fasterfox extension and Firetune.
- Reduce the number of running programs that use your Internet connection (instant messengers, RSS feeders, and MS applications set to send Internet data).
- Google Accessible is designed to search pages in order of how clean they are of junk. This will bring up pages that are usually not only easy to read, but quick to load.
4. Upgrade your RAM. This will not only improve your regular computer use, but it will affect the speed of your Internet connection because your computer works faster.
5. Use the Stop button to stop loading pages once you've gotten what you want.
6. Sometimes malware on your computer can eat up your bandwidth. Make sure you have an up-to-date malware protection program.
7. Most Internet providers have flaky DNS servers (no citation necessary, it's a given), so instead of using those provided by your ISP, switch your DNS servers to those of OpenDNS. OpenDNS is far faster and more reliable; simply using 208.67.222.222 and 208.67.220.220 as your domain name servers will clear up most flaky-DNS problems (and may even speed up your browsing, since OpenDNS has large caches).
8. Look into running your own local DNS server on your network. Some newer routers include their own nameserver; otherwise, check out AnalogX.com's DNSCache program. It works great for holding commonly accessed domain names in a cache so that their IP addresses do not have to be looked up every time you navigate to a new page.
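The caching idea in tip 8 can be sketched in a few lines. The class below is a toy, not the AnalogX program: the actual lookup function is injected so the example stays self-contained, and a single fixed TTL stands in for real per-record DNS TTLs.

```python
import time

class DNSCache:
    """Toy name -> IP cache with a TTL, illustrating tip 8. A production
    cache would speak the DNS protocol and honor each record's own TTL."""

    def __init__(self, resolver, ttl_seconds: float = 300.0):
        self._resolver = resolver          # injected lookup function
        self._ttl = ttl_seconds
        self._store = {}                   # name -> (ip, expiry_time)

    def lookup(self, name: str) -> str:
        entry = self._store.get(name)
        now = time.monotonic()
        if entry and entry[1] > now:
            return entry[0]                # cache hit: no network round trip
        ip = self._resolver(name)          # cache miss: do the slow lookup
        self._store[name] = (ip, now + self._ttl)
        return ip
```

Wiring in `socket.gethostbyname` as the resolver gives a working local cache for scripts that hit the same hostnames repeatedly; only the first lookup per TTL window pays the network cost.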
WARNINGS: 1. Viruses and malware can often use up your bandwidth and slow down your Internet connection. Make sure you have protection against this. Many ISP's will provide software for this. Make sure your anti-virus and malware scanners are up-to-date.
2. Bypassing the router will leave you more vulnerable to attacks because you no longer have the built-in firewall from your router protecting you.
3. Watch out for scams that claim to make your Internet go a lot faster for free. They may tell you to download their program, which usually has a lot of other hidden programs attached that might steal your identity.
http://askbobrankin.com/internet_connection_fails.html http://www.wikihow.com/Maximize-the-Speed-of-Your-Internet-Connection | |
| | | alma cabase
Posts : 56 Points : 58 Join date : 2009-06-20
| | | | neil rey c. niere
Posts : 55 Points : 58 Join date : 2009-06-19 Age : 32
| Subject: Re: Assignment 6 (Due: before August 19, 2009, 13:00hrs) Mon Aug 31, 2009 3:58 am | |
| (2nd post) continuation....

How Internet Infrastructure Works?

It is a global collection of networks, both big and small. These networks connect together in many different ways to form the single entity that we know as the Internet. In fact, the very name comes from this idea of interconnected networks. Since its beginning in 1969, the Internet has grown from four host computer systems to tens of millions. However, just because nobody owns the Internet, it doesn't mean it is not monitored and maintained in different ways. The Internet Society, a non-profit group established in 1992, oversees the formation of the policies and protocols that define how we use and interact with the Internet.

Internet infrastructure consists of these areas:

Network
A group of two or more computer systems linked together. There are many types of computer networks, including:
• local-area networks (LANs): The computers are geographically close together (that is, in the same building).
• wide-area networks (WANs): The computers are farther apart and are connected by telephone lines or radio waves.
• campus-area networks (CANs): The computers are within a limited geographic area, such as a campus or military base.
• metropolitan-area networks (MANs): A data network designed for a town or city.
• home-area networks (HANs): A network contained within a user's home that connects a person's digital devices.

In addition to these types, the following characteristics are also used to categorize different types of networks:
• topology: The geometric arrangement of a computer system. Common topologies include a bus, star, and ring. See the network topology diagrams in the Quick Reference section of Webopedia.
• protocol: The protocol defines a common set of rules and signals that computers on the network use to communicate. One of the most popular protocols for LANs is called Ethernet. Another popular LAN protocol for PCs is the IBM token-ring network.
• architecture: Networks can be broadly classified as using either a peer-to-peer or client/server architecture. Computers on a network are sometimes called nodes. Computers and devices that allocate resources for a network are called servers.

Possibly the most important foundation block of Internet infrastructure is the network. Without a network connection no data can pass between data centres, over the Internet, and ultimately onto your desktop, laptop or mobile handset. For the purpose of this post, let's talk about the network infrastructure in a data centre, where data is passed in to computer equipment, processed and/or stored, and passed back out of the DC. You would expect at least N+1 network connectivity into a data centre, in the form of at least two fibre cables from telecommunications providers on diverse rings; therefore, if one had its service cut, the data centre's network connection would not be affected. Some data centres (Hosting365's is one) are carrier neutral, which means a number of carriers have a point of presence in the facility, so the data centre is not affected by any commercial or technical issues of a single carrier. Next you would expect redundant switch gear in the data centre in separate racks, so again if one set of switch gear failed, the other would simply take over and no service interruption would be experienced. The unit of measurement for network connectivity is megabits per second and available megabits on the carrier connection. There may be 1 gigabit available, but the DC may only be using, and paying for, 100 megabits. The ability to meet peak demand is important, though, so data centres will have a lot more connectivity available than is required for daily operations.

Storage Services
Data storage is a huge part of Internet infrastructure. All those emails accessible online, all the web pages on your favourite web site, all those photos on Facebook … are all stored on a hard drive in a DC somewhere.
The basic level of storage is on-server storage, which means the hard drives in the computer server. This can cause not just performance and capacity issues, but also redundancy ones - local storage is inherently as prone to failure as the server it is in. It is common to use specific storage devices - such as Direct Attached Storage (a dedicated and dumb storage appliance connected directly to your server), Network Attached Storage (a storage device that can be accessed by multiple machines over a network connection, independent of the server itself) and Storage Area Networks, which are high-end, resilient and redundant set-ups that give high performance levels and are very scalable. A Storage Area Network may be shared among many services, applications, servers and customers. The unit of measure in storage is gigabytes (more commonly terabytes now) and IOs per second (input-output read/writes the device can perform per second).

Computer Equipment
Now that the two basics of Internet infrastructure are in place - the ability to power your equipment and the ability to connect it to the Internet - the next thing is the computer hardware that uses this to process and store the applications and data. By computer equipment, for this basic post, I really mean servers. A server is a more complex and high-end version of a desktop PC. An average server might consist of 2 power supplies (for redundancy), 8-12 RAM slots, anything from 2-10 hard drive bays and multiple processors (not just multi-core!). Servers are housed in racks in a DC, which are typically 42U in height (1U is 1 unit; a low-end server takes up just 1 of these units, and other servers scale within these racks to multiple 'U'). Racks are normally powered by 2 PDUs (Power Distribution Units) which connect to (if available) multiple power supply units in the server. A low-end installation may be only a single server, which is the simplest form of Internet infrastructure.
The server would be connected to the DC power and the network, with an OS and other required applications installed on it. Then it is ready to 'power and push' data on the Internet. More complex deployments would include pools of servers, with different applications on each one, or clusters of pools for multiple clusters with dedicated application requirements. The unit of measure for servers is processor power and RAM, although there is a lot more to selecting a server, such as expandability, reliability, network ports, bus speed, and cache size and speed. Personally I would like the unit of measure for servers to change; I think for buyers and users it should be rated in 'MIPS' (Millions of Instructions Per Second), which is effectively all that matters, and how today's high-end systems such as IBM's BlueGene are measured.

Why recycle computer equipment?
Computer equipment recycling reduces the volume of waste which ends up in landfill sites, or gets dumped illegally. It cuts down on the amount of raw materials needed for the manufacture of new products, and it also means more efficient and convenient recycling for the end user. In addition, if computing equipment is refurbished, this can benefit people and organisations that cannot afford to buy new IT equipment. It is possible to recycle many parts of an IT system, particularly monitors, PCs and servers. Computer peripherals, such as printers and scanners, can also be recycled, as can landline and mobile phones. However, some elements of an IT system may need particular expertise to recycle, with PCs, for example, tending to have heavy metals in their circuit boards.

Service Design
Service design is the specification and construction of technologically networked social practices that deliver valuable capacities for action to a particular customer. Capacity for action in information services has the basic form of assertions.
In Health Services, it has the basic form of diagnostic assessments and prescriptions (commands). In Educational Services, it has the form of a promise to produce a new capacity for the customer to make new promises. In a fundamental way, services are unambiguously tangible. Companies such as eBay, or collectives such as Wikipedia or Sourceforge, are rich and sophisticated combinations of basic linguistic deliverables that expand customers' capacities to act and produce value for themselves and for others. In an abstract sense, services are networked intelligence. Service design can be both tangible and intangible. It can involve artefacts and other things, including communication, environment and behaviours. Several authors (Eiglier 1977; Normann 2000; Morelli 2002), though, emphasise that, unlike products, which are created and "exist" before being purchased and used, services come into existence at the same moment they are being provided and used. While a designer can prescribe the exact configuration of a product, s/he cannot prescribe in the same way the result of the interaction between customers and service providers, nor can s/he prescribe the form and characteristics of any emotional value produced by the service. Consequently, service design is an activity that suggests behavioural patterns or "scripts" to the actors interacting in the service, leaving a higher level of freedom to the customers' behaviour.

http://ezinearticles.com/?IT-Consultation:--Do-You-Have-the-Required-Business-Skills?&id=290641
http://www.prospects.ac.uk/p/types_of_job/it_consultant_job_description.jsp
http://www.howstuffworks.com/internet-infrastructure.htm
http://en.wikipedia.org/wiki/Service_design#Characteristics_of_Service_Design

totally done!... | |
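As a footnote to the topology list near the top of this post: the star, ring and mesh layouts differ sharply in how much cabling they need. The counts below are the standard textbook figures; the Python form is purely illustrative (a bus is omitted, since its nodes share one backbone cable).

```python
def links_required(topology: str, n_nodes: int) -> int:
    """Point-to-point links needed to connect n nodes, by topology."""
    if topology == "star":
        return n_nodes - 1                   # every node wired to a central hub
    if topology == "ring":
        return n_nodes                       # each node linked to the next, closing the loop
    if topology == "mesh":
        return n_nodes * (n_nodes - 1) // 2  # every pair of nodes directly linked
    raise ValueError(f"unknown topology: {topology}")

# A 10-node lab: a star needs 9 links, a ring 10, a full mesh 45.
assert links_required("star", 10) == 9
assert links_required("mesh", 10) == 45
```

The quadratic growth of the mesh count is why full meshes are reserved for small cores, while labs and offices are almost always wired as stars around a switch.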
| | | alma cabase
Posts : 56 Points : 58 Join date : 2009-06-20
| Subject: Re: Assignment 6 (Due: before August 19, 2009, 13:00hrs) Tue Sep 01, 2009 3:31 pm | |
| If you were hired by the university president as an IT consultant, what would you suggest (technology, infrastructure, innovations, steps, processes, etc.) in order for the internet connectivity to be improved? (3000 words)

If I were hired as the university IT consultant, the first thing I would do before giving my suggestions is to observe and study the university's current situation, particularly its budget for IT-related projects; from that assessment I could come up with more realistic suggestions.
If the university has enough budget for IT-related innovations and upgrades, these would likely be my suggestions.

TECHNOLOGY

In terms of technology, the university is not far behind the private colleges and other universities here in Davao. But if we talk of universities in Asia, we are a hundred miles behind them. I have researched and gathered a comparison of the technologies that our university and universities outside the Philippines are using. I have chosen the University of Hong Kong as the school ours will be compared to, not only because it has been the top university in Asia for three years now, but mainly because it boasts that its technologies are among the best not only in Asia but in the world. Here is the comparison:

UNIVERSITY OF HONG KONG (Computer Centre)

Computers & Connections:
-46 Core 2 Duo (2.66 GHz) 3 GB PCs with 19" wide-screen LCD panels & DVD+/-R/RW writer drives
-25 Core 2 Duo (2.66 GHz) 2 GB PCs with 19" wide-screen LCD panels & DVD+/-R/RW writer drives (one PC for class instructor)
-21 Core 2 Quad (2.33 GHz) 4 GB PCs with 19" LCD panels & DVD+/-R/RW drives (one PC for class instructor)
-18 ACEnet connection points for connection of notebook computers
-60 Core 2 Quad (2.33 GHz) 4 GB PCs with 19" LCD panels & DVD+/-R/RW drives
-30 Core 2 Duo (2.13 GHz) 2 GB PCs with 17" LCD panels & DVD+/-R/RW drives (one PC for class instructor)
-30 Core 2 Duo (2.13 GHz) 2 GB PCs with 17" LCD panels & DVD+/-R/RW writer drives (one PC for class instructor)
-HKU High Performance Computing Cluster

In the academic and research arena, High Performance Computing (HPC) facilities are heavily used for solving computational problems that are not feasible on conventional computers, due to the huge CPU power, memory, network and disk space requirements. To facilitate intensive computations, the Computer Centre set up a High Performance Computing cluster, namely hpcpower.hku.hk, in October 2003. Another 64-bit Linux cluster, namely hpcpower2.hku.hk, went into service in late 2008 to augment the existing 32-bit HPCPOWER cluster system.
HKU Grid Facilities
* hpcpower2: 64-bit Linux cluster consisting of 24 nodes
+ each node has two 64-bit quad-core Intel Xeon CPUs running at 3 GHz
* hpcpower: 32-bit Linux cluster consisting of 178 nodes
+ 128 nodes of dual 2.8 GHz Xeon processors, and
+ 50 nodes of dual 3.06 GHz Xeon processors
* winhpc: first Microsoft Windows based High Performance Computing cluster in Hong Kong
* condor: Condor pool in the PC laboratory

Printers
- 3 networked HP LaserJet 4350 black & white duplex laser printers (duplex printing by default)
- 1 networked HP LaserJet 4350 black & white duplex laser printer (single-side printing by default)
- 1 networked HP LaserJet 4515x black & white duplex laser printer (duplex printing by default)
- 1 networked HP LaserJet 5500DTN color laser printer
- 1 networked Fuji Xerox black & white laser printer (single-side by default)

Special Facilities
- Color scanner (in RR-104)
- RealVideo & VCD production PC (in RR-104)
- DVD production PC (in RR-104)
- Optical Mark Reading (OMR) scanner system (in RR-104)

USEP (COMPUTER LABORATORIES)
I have been researching the facilities that our university has, and after an hour I have not found any page that enumerates and discusses them one by one. Because of this, the records below are based on my observations as a current student of the university.
Computers & Connections:
-20 P4 (1.3-1.6 GHz) 256 MB-512 MB PCs with 15" CRT monitors (about 10 with 15" LCD panels)
-12 AMD Athlon (2.0-2.6 GHz) 512 MB-1 GB PCs with 15" CRT monitors
-36-64 port switch hub (networking in Lab 1)

Printers
-1 dot-matrix printer
-1 laser printer (Nodal)

Special Facilities
-(none)

As you can see for yourselves, our university is far behind the latter. This comparison is not meant to criticize our university but to examine the points our university has failed to examine. Given enough budget, I would definitely suggest a total system upgrade. The processors in our computer labs are very outdated. It would be good to upgrade all the computers to at least dual-core processors with 1 GB of memory. The display monitors can be left as they are; the important thing is to increase the computing power of the units so that students can learn and practice with the latest software available these days. I would also suggest adding at least three more printers. As a student myself, I spend a lot of money printing my proposals and other school-related papers; if printers were available within the school, it would be very convenient for students, and it could also be a good source of funds for the university. As for purchasing special facilities, at this point in time it would be unwise for the university to buy such advanced technologies, not only because they are very expensive, but mainly because, looking at the university's current IT-related course offerings, these special facilities are not of great relevance.
INFRASTRUCTURE

We may not know it, but IT infrastructure is vital to the success of any business or program with a direct IT relation. I have browsed the internet and found these interesting articles about the importance and difficulties of IT infrastructure, and based on these I will derive my suggestions pertaining to the IT infrastructure of our university.

Infrastructure: IT's stepchild
By Bart Perkins
September 22, 2008 12:00 PM ET
Computerworld - Every enterprise needs a robust IT infrastructure in order to function effectively. Infrastructure is the foundation of corporate productivity and success. Many IT groups, however, don't have enough skilled infrastructure staffers to provide the solid foundation required. Unfortunately, qualified infrastructure people are hard to find. Here's why:

Applications are more highly valued. Most executives recognize that effective applications offer significant business value. Unfortunately, they usually assume that the underlying infrastructure is easy to construct and maintain. As a result, they often give less attention and recognition to infrastructure. (Even CIOs generally understand applications better than infrastructure.)

Infrastructure is increasingly complex. The infrastructure group now manages a number of new technologies, including virtualization, advanced networking and cloud computing. In addition, infrastructure frequently has primary responsibility for privacy, security and standards. As the biggest energy consumer, infrastructure is also responsible for "green" initiatives, such as cutting IT energy use and complying with hazardous-substances mandates. All the pieces must then be knit together efficiently. As a result, infrastructure jobs require far more technical breadth and depth than ever before.

Infrastructure is becoming more customer-focused. With the advent of software as a service, outsourcing and application software generators, IT needs fewer technical specialists. But infrastructure functions now require high levels of customer contact, because of ITIL v3's focus on customer service. Many technical staffers (often introverted, per the stereotype) are uncomfortable with this requirement.

Compensation is lower. Historically, infrastructure departments offered entry-level IT jobs to individuals without college degrees.
HR justified paying them lower salaries by claiming that they had fewer technical skills than their applications counterparts. Even though most low-skill infrastructure jobs have been automated and eliminated, perceptions have been slow to change. Compensation plans have not been adjusted to reflect the higher levels of technical expertise infrastructure now requires.

Infrastructure is a thankless job. Unfortunately, many employees have a very limited understanding of infrastructure. Few people appreciate the difficulty of the preproduction testing or postproduction tuning associated with installing a new system. This lack of understanding often leaves infrastructure staffs feeling undervalued and underappreciated. When the servers are up and the network is functioning, infrastructure availability is taken for granted. But when work stops because an application is unavailable or the network goes down, all fingers point to infrastructure. Infrastructure gets attention only for failures.

Infrastructure education is insufficient. Few U.S. colleges offer IT courses covering infrastructure functions. In addition, most high school and college career counselors advise students that there are more job opportunities in applications than in infrastructure. Moreover, the head of applications is more often promoted to CIO than the head of infrastructure, so the long-term career path is not very appealing. The result is a shortage of qualified people pursuing infrastructure careers.

IT's infrastructure organization requires increasing levels of technical skills to deal with the complex and constantly shifting work environment. But lack of appreciation, lower compensation and a limited career path make it difficult to attract and retain qualified professionals. Infrastructure staffers need to be treated as invaluable employees who are critical to the success of the enterprise, because in today's IT environment, they really, really are.
As I read the article above, it is obvious that a change in IT infrastructure can be very risky. We might be blinded by the positive effects it could bring to our system, but a couple of mistakes could cause a disaster in the overall operation of the university. The article talks about requiring skilled IT personnel when considering an infrastructure change, and this is very true. As I have observed, our university lacks these people. We have the option to hire more skilled IT practitioners, but that would only mean more expenses for the university. In my opinion, the best thing to do with our university's IT infrastructure at present is to maintain it. It is not necessary to change it to a more complex one right now; the important thing is that it works without any alarming glitches. The time will come when it is necessary to upgrade or replace the present structure, but as I have said, today is not that time.
| |
| | | alma cabase
Posts : 56 Points : 58 Join date : 2009-06-20
| Subject: Re: Assignment 6 (Due: before August 19, 2009, 13:00hrs) Tue Sep 01, 2009 3:57 pm | |
| (continuation) STEPS

*Check your web browser settings. Open Internet Explorer and click the "Tools" option at the top, then select "Internet Options," followed by the "Connections" tab. Click the "LAN settings" button and see whether any of the listed options are checked. If you find any checkmarks, deselect them, then click OK or Apply.

*If you came here to find a tool to download that will enhance your web speed, consider yourself lucky, because there is a free tool you can download from a trusted source, Google: it is called Google Web Accelerator. It does speed up your surfing, making it seem as if your internet connection is faster. It is just another internet speed booster.

*Do some basic maintenance on your PC. Run Disk Defragmenter, a disk scan, a virus scan, and a malware scan, and empty your recycle bin. An unusually slow internet connection is often the only sign that your computer is infected with viruses or other malware. Delete old files and temporary files, and never let the free space on your C: drive fall below 10% of the total size or twice the installed RAM (whichever is larger). A well-maintained PC will perform much better than a PC that has never had any maintenance. Google or your local computer repair shop should be able to help you with this if you don't know how.

*A more technical strategy to boost the speed of your internet is to either rebuild your computer's Winsock or use Tweaktester. Winsock (Windows Sockets) is what your Windows computer uses to handle network input and output. It can become congested or damaged to some extent by spyware and other software, and this happens with normal everyday use, so professionals sometimes reset it with a Winsock utility to rebuild it.
On the other hand, the Tweaktester tool works with the Receive Window of your XP operating system. By default the Receive Window is set to a value much too low for today's modern high-speed internet demands, so you can change it to a larger number to improve internet performance. There is also a service called OpenDNS that is widely regarded as a good internet speed booster.

*Reset your home network. Sometimes restarting your home network, if you have one, will drastically increase the speed of your connection.

*Optimize your cache of temporary internet files. These files improve your connection's performance by not downloading the same file over and over: when a website puts its logo graphic on every page, your computer only downloads it when it changes. If you delete the temporary files, the logo must be downloaded again; if you disable the cache, it must be downloaded every time you view a page that uses it. In Internet Explorer, click "Tools" at the top, choose "Internet Options," and on the General tab click the "Settings" button next to Temporary Internet Files. Set "Check for newer versions" to "Automatically," and set the amount of disk space to use to 2% of your total disk size or 512 MB, whichever is smaller. In Firefox, click "Tools," then "Options," go to the Privacy tab, and then click the Cache tab within it.

*Never bypass your router. Most routers include a firewall that is very difficult for hackers to defeat. If you don't need wireless, hook your computer directly to your router. A router will only slow down your connection by a few milliseconds; you won't notice the difference, but the hackers will.

*If you are using a wireless router, make sure it doesn't conflict with a cordless phone or wireless camera.
Wireless routers come in two varieties: 802.11b/g (2.4 GHz) and 802.11a (5 GHz). If you are using a 2.4 GHz cordless phone with a 2.4 GHz wireless router, your internet connection speed will slow while you use the cordless phone, and the same is true of wireless security cameras. Check your phone and camera: if they run at 900 MHz they are fine, but if they say 2.4 GHz or 5.8 GHz, they could be the cause of your slow connection speed while they are in use.

*Call your internet service provider (ISP). Sometimes you just have bad service. They can usually tell whether your connection is substandard without sending a technician to your home; just be nice and ask.

*Upgrade your computer. If your computer is slow, it doesn't matter how fast your internet connection is; the whole thing will just seem slow. You can only access the internet as fast as your PC allows.

*Replace your old cable modem. Any solid-state electronics will degrade over time due to accumulated heat damage. Your broadband modem will have a harder and harder time maintaining a good connection as it gets older (the signal-to-noise ratio will go down, and the number of resend requests for the same packet will go up). An after-market cable modem, as opposed to a cable-company modem, will frequently offer a better connection.

*Often your connection speed is slow because other programs are using it. To test whether programs are accessing the internet without your knowledge, click Start, then Run, and type "cmd" (without quotes). In the command window, type "netstat -b 5 > activity.txt"; after a minute or so, hold down Ctrl and press C. This creates a file listing all programs using your internet connection. Type "activity.txt" to open the file and view the program list. Then press Ctrl+Alt+Delete, open the Task Manager, go to the Processes tab, and end the processes that are stealing your valuable bandwidth.
(NOTE: Ending processes may cause certain programs to stop functioning properly.)

*Call your ISP and have them verify all of your TCP/IP settings if you are concerned, and ask them to verify that your proxy settings are correct.

*Don't expect dial-up or "high-speed lite" service to be fast. The internet is primarily geared toward broadband connections; sometimes you have to wait a little.

*Realize that everything has a limit, including the speed of the internet connection provided by your ISP. You cannot surpass the speed your connection originally comes with; what you can do is reach the fastest speed that is available through your ISP. Old-fashioned dial-up over a phone line is very slow, broadband is a good deal faster, and cable internet tends to be the fastest because it doesn't use the phone line. So remember, we are only trying to reach the top speed that is possible.

*Download programs that make browsing faster:
- Loband.org is a browser inside a browser that loads web pages without the images.
- Firefox and Opera both have options to disable images.
- In Firefox, you can also use extensions such as NoScript that let you block scripts and plug-ins that would otherwise slow things down a lot.
- If you are using Internet Explorer or Firefox, try Google Web Accelerator. It is meant to speed up broadband connections, but it can also slow your connection; try enabling and disabling it and see when your connection runs faster.
- If you are using Firefox, try the Fasterfox extension and FireTune.
- Reduce the number of running programs that use your internet connection (instant messengers, RSS feed readers, and MS applications set to send internet data).
- Google Accessible is designed to rank pages by how clean they are of junk.
This will bring up pages that are usually not only easy to read but also quick to load.

*Upgrade your RAM. This will not only improve your regular computer use but will also affect the speed of your internet connection, because your computer works faster.

*Use the Stop button to stop loading a page once you've gotten what you want.

*Sometimes malware on your computer can eat up your bandwidth. Make sure you have an up-to-date malware protection program.

*Many internet providers have flaky DNS servers, so instead of using those provided by your ISP, switch your DNS servers to those of OpenDNS. OpenDNS is far faster and more reliable; simply using 208.67.222.222 and 208.67.220.220 as your domain name servers will clear up most flaky DNS problems (and may even speed up your browsing, since OpenDNS has large caches).

*Look into running your own local DNS server on your network. Some newer routers include their own nameserver; otherwise, check AnalogX.com's DNSCache program. It holds commonly accessed domain names in a cache so the IP addresses do not have to be looked up every time you navigate to a new page.

*Last and most important: if your internet connection has suddenly gotten slower, it could be due to malicious adware, spyware, or viruses running on your computer without your knowledge. There are a lot of free antivirus scanners on the web, but they will charge you to repair or delete the viruses after they find them, so it is a better idea to buy the software to start with, because the free or trial versions don't go all the way. Also consider installing antivirus software and leaving it on all the time. Moreover, the most common reason internet speed slows down is a trojan attack, which is why a good antivirus is very advisable. As I have observed, the antivirus programs that our university is using are only demo versions or for home use only.
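The OpenDNS suggestion in the list above can be sanity-checked by timing a few name lookups. A minimal sketch using only the Python standard library; it measures whichever resolver the operating system is currently configured with, so run it once with the ISP's DNS and once after switching, and the hostnames here are only examples:

```python
import socket
import time

def time_lookup(hostname):
    """Resolve a hostname with the system resolver; return seconds taken."""
    start = time.perf_counter()
    socket.gethostbyname(hostname)
    return time.perf_counter() - start

if __name__ == "__main__":
    # "localhost" resolves even without the network; add sites you visit often.
    for host in ("localhost",):
        try:
            print(f"{host}: {time_lookup(host) * 1000:.2f} ms")
        except socket.gaierror:
            print(f"{host}: lookup failed")
```

Note that repeated lookups of the same name are usually cached, so the first measurement is the interesting one.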
We must take into consideration the positive effects that a good antivirus would bring to the security of the university's computers and files. I personally suggest these applications:
http://www.eset.com/
http://www.avira.com/en/pages/index.php
http://www.kaspersky.com/ | |
| | | Dolorosa G. Mancera
Posts : 40 Points : 43 Join date : 2009-06-23 Age : 34 Location : Lacson Ext. Obrero Davao City
| Subject: Assignment 6 Mon Sep 07, 2009 11:06 pm | |
| It would really be a great privilege if I were hired as an IT consultant by the university president. But before I go on to answer the question, let me first define what an IT consultant is and what the typical work activities of an IT consultant are.
IT Consultant: Job description
• An IT consultant works in partnership with clients, advising them how to use information technology in order to meet their business objectives or overcome problems. Consultants work to improve the structure and efficiency of an organization's IT systems.
• IT consultants may be involved in a variety of activities, including marketing, project management, client relationship management and systems development.
• They may also be responsible for user training and feedback. In many companies, these tasks are carried out by an IT project team. IT consultants are increasingly involved in sales and business development, as well as technical duties.
As I have read, the usual activities of an IT consultant are the following: meeting with clients to determine requirements; working with clients to define the scope of a project; planning timescales and the resources needed; clarifying a client's system specifications, understanding their work practices and the nature of their business; travelling to customer sites; liaising with staff at all levels of a client organization; defining software, hardware and network requirements; analyzing IT requirements within companies and giving independent and objective advice on the use of IT; developing agreed solutions and implementing new systems; presenting solutions in written or oral reports; helping clients with change-management activities; project-managing the design and implementation of preferred solutions; purchasing systems where appropriate; designing, testing, installing and monitoring new systems; preparing documentation and presenting progress reports to customers; organizing training for users and other consultants; being involved in sales and support and, where appropriate, maintaining contact with client organizations; and identifying potential clients and building and maintaining contacts. Knowing the typical tasks of an IT consultant gives a sense of how demanding their duties and responsibilities in an organization or company are. Information technology consulting is a field that focuses on advising businesses on how to use information technology to meet their business objectives.
I know every one of us is familiar with the internet. With fast-evolving technology, almost all of us know the use of the internet, not only for business but also in school and for entertainment. A concrete definition of the internet given by Wikipedia is: "the internet is a global system of interconnected computer networks that use the standardized Internet Protocol Suite (TCP/IP). It is a network of networks that consists of millions of private and public, academic, business, and government networks of local to global scope that are linked by copper wires, fiber optic cables, wireless connections, and other technologies. The internet carries a vast array of information resources and services, most notably the inter-linked hypertext documents of the World Wide Web (WWW) and the infrastructure to support electronic mail, in addition to popular services such as online chat, file transfer and file sharing, online gaming, and Voice over Internet Protocol (VoIP) person-to-person communication via voice and video."
In modern life, the internet plays a big role in making communication easier. Companies, schools, universities and almost all establishments have an internet connection, so we need to consider several things in order to improve our internet connectivity. Among them are the infrastructure, the technologies (staying updated with the latest trends), and the innovations. The different steps and processes in improving an internet connection should also be considered.
Suggestions…
In a university, the internet connection is very important for the students and the faculty alike. But sometimes we encounter problems with the connection. Among those we often meet are frustratingly low connection and download speeds, computers running slow, and connection errors. A slow internet connection is a frustration for any user, while a fast high-speed connection puts the user at an obvious advantage. With all the high technology available today, dial-up access remains the slowest of all the internet services available, while cable internet tends to be one of the fastest options. As a student of this university and as an IT consultant, here are some tips I could suggest to improve the internet connectivity of our university.
• Improving the Internet connection
The first tip that I guess would help the university is to get rid of the modem they use and move to ADSL. Broadband is now available at low cost in most areas, and "always-on" cable and ADSL connections are becoming more widely available nowadays. According to what I have read, these will give you a permanent connection and a much faster data speed. For ADSL you will need a standard BT telephone line and will pay for a connection through an ISP, which provides an ADSL "modem" as part of the installation cost. In a university, a lot of computers share a single modem. This can potentially slow each computer down a bit, but usually doesn't make that much of an impact on the speed of your system.
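The point about many computers sharing one line can be made concrete with a little arithmetic: under fair sharing, each active machine gets roughly the total line speed divided by the number of active machines. A small illustrative sketch; the line speed and lab size here are made-up example figures, not the university's actual numbers:

```python
def per_host_kbps(total_kbps, active_hosts):
    """Rough fair-share bandwidth per machine on a shared link."""
    if active_hosts < 1:
        raise ValueError("need at least one active host")
    return total_kbps / active_hosts

# A hypothetical 2 Mbps (2048 kbps) ADSL line shared by a 40-seat lab:
print(per_host_kbps(2048, 40))  # 51.2 kbps each, barely faster than dial-up
```

This is only a back-of-the-envelope bound; real traffic is bursty, so idle machines free up bandwidth for the others.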
The next tip focuses on system security, which is very important for any computer. All the computers should have anti-virus software and firewalls to help with system security; examples are Symantec Norton AntiVirus, McAfee AntiVirus, and Spyware Doctor with AntiVirus 6. Some choose to disable their security programs in order to speed up their computers, but this is never a good idea, because the computer won't run fast for long once it gets a virus or some spyware on the system. So it is really advisable to install antivirus or antispyware on the computers, and it would be better to back up critical data regularly as well. No matter what type of high-speed internet connection the university will use, it's a good thing to remember to install system security to protect the computers and all their personal and confidential information.
My third suggestion is to test the internet speed. One of the most common issues with slow internet service is that the provider is not delivering what it advertised. The university is paying for it, so you might as well make sure it is being delivered and used as effectively as possible. There are many sites online where you can check your connection speed for free, and you can also use the Network Diagnostic Tool to test the network connection speed.
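Besides the free speed-test sites, a rough check can be scripted: download a file of known size and divide by the elapsed time. A sketch with the Python standard library; the URL is a placeholder, so substitute any reasonably large file hosted by the ISP or a nearby mirror before running a live test:

```python
import time
import urllib.request

def mbps(num_bytes, seconds):
    """Convert a byte count and a duration into megabits per second."""
    return (num_bytes * 8) / (seconds * 1_000_000)

def measure(url):
    """Download url once and report the average throughput in Mbit/s."""
    start = time.perf_counter()
    data = urllib.request.urlopen(url).read()
    return mbps(len(data), time.perf_counter() - start)

if __name__ == "__main__":
    # Example of the arithmetic: a 5 MB file fetched in 4 seconds.
    print(f"{mbps(5_000_000, 4):.1f} Mbit/s")
    # For a live test, point measure() at a real file, e.g.:
    # print(measure("http://example.com/testfile.bin"))
```

Run it a few times at different hours; a single measurement says little, since the shared line's load varies through the day.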
Another suggestion: if a computer is not in use, turn it off. Since many computers share the same modem and internet connection, turning off unused computers reduces the competition for limited bandwidth and will usually boost the performance of the computers in use. A fifth thing to be aware of is a weak access point signal. There are many ways an access point can slow down the connection: for instance, if the signal between you and the access point is weak, the access point will automatically downgrade its service to a slower speed, and sometimes you may encounter an error in the connection. Always check what signal you are connected to.
For downloading, I would advise using FTP wherever possible. As I have read, when downloading files you can often choose between FTP (File Transfer Protocol) and HTTP download. For me, File Transfer Protocol, which uses the well-known port 21, is much faster for file transfer.
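Fetching a file over FTP on port 21, as suggested above, is straightforward with Python's standard ftplib. A minimal sketch assuming an anonymous-login server; the host and path in the usage comment are hypothetical:

```python
from ftplib import FTP
from urllib.parse import urlparse

def split_ftp_url(url):
    """Break an ftp:// URL into (host, remote_path)."""
    parts = urlparse(url)
    return parts.hostname, parts.path

def ftp_fetch(url, local_name):
    """Download one file from an anonymous FTP server."""
    host, path = split_ftp_url(url)
    with FTP(host) as ftp:           # connects on the well-known port 21
        ftp.login()                  # anonymous login
        with open(local_name, "wb") as out:
            ftp.retrbinary(f"RETR {path}", out.write)

# Usage (hypothetical server and file):
# ftp_fetch("ftp://ftp.example.edu/pub/lecture-notes.pdf", "notes.pdf")
```

Servers that require credentials take them via ftp.login(user, password); everything else stays the same.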
References:
http://www.high-speed-internet-access-guide.com/articles/increase-modem-speed.html
http://www.telecomsadvice.org.uk/infosheets/internet_connectivity2.htm
http://www.practicalpc.co.uk/computing/comms/speedup.htm
http://www.prospects.ac.uk/p/types_of_job/it_consultant_job_description.jsp
http://netequalizernews.com/2009/03/07/7-tips-and-tricks-to-speed-up-your-internet-connection/
http://en.wikipedia.org/
| |
| | | kate karen rasonable
Posts : 36 Points : 44 Join date : 2009-06-19 Age : 36 Location : Davao City, Philippines
| Subject: Re: Assignment 6 (Due: before August 19, 2009, 13:00hrs) Wed Sep 09, 2009 2:47 pm | |
| The pace of change brought about by new technologies has had a significant effect on the way people live, work, and play worldwide. New and emerging technologies challenge the traditional process of teaching and learning, and the way education is managed. Information technology, while an important area of study in its own right, is having a major impact across all curriculum areas. Easy worldwide communication provides instant access to a vast array of data, challenging assimilation and assessment skills. Rapid communication, plus increased access to IT in the home, at work, and in educational establishments, could mean that learning becomes a truly lifelong activity, one in which the pace of technological change forces constant evaluation of the learning process itself. The vastness of the Internet has something for everyone: we use it to communicate, to play, to work. As the Internet becomes ever-present on college campuses, students are finding more and more ways to use computer technology. Because the Internet has its roots in universities as well as in business, it is not surprising that more and more students are conducting academic research online. The Internet, a collection of computer networks that operate to common standards and enable the computers and the programs they run to communicate directly, provides a vast amount of up-to-date information. The availability of the internet on college campuses and in universities gives students a means to communicate with the world. It also serves as a means for them to learn: by browsing the internet they are able to broaden their knowledge by gathering information. The internet is known to be very useful for students nowadays. It can be useful in doing their assignments, research, thesis writing, projects, and so on, and it improves the development of students' learning.
Here in the University of Southeastern Philippines, students are required to pay an internet fee, and yet we are experiencing a slow internet connection. If I were hired by the university president to look into this problem of slow internet connection, I would suggest the following:

1) Use of software such as the following:

Internet speed boosters:

webROCKET 2002 is a powerful, easy-to-use program for Windows 95, 98, Me, NT, XP and 2000 which boosts your internet connection speed by up to 200%. Without webROCKET, Windows lacks the power to provide an optimal internet connection because of changing, unstable network conditions. webROCKET automatically turbocharges the connection by boosting internet data transport efficiency, adapting your modem or high-speed connection to its maximum potential. webROCKET is compatible with any home or office internet connection that works in Windows, including dial-up modems of any speed and high-speed connections such as cable modems, DSL, ISDN, T-1 and LAN, and it works with all internet services, including AOL and local ISPs.

Unlike other internet boosters, ActiveSpeed actually learns while you surf, boosting your connection faster and faster over time. ActiveSpeed's patent-pending Intelligent Optimization Engine is not available in any other product.

Dr. Speed automatically supercharges the internet connection you already have by optimizing and boosting internet data transport efficiency. Dr. Speed works with your modem or high-speed connection to bring your surfing speeds and computer efficiency to their maximum potential.

Turbo Surfer automatically optimizes your internet connection, boosting your internet speed by up to 220%. It works with all connections: America Online, cable modems, all phone modems, even DSL.

TweakMASTER is a new-generation internet optimizer from the leading pioneers in the field.
It promotes faster internet download speeds by carefully and intelligently tweaking various 'hidden' Windows settings. It is ideal for all types of internet connections, including dial-up, cable modem, DSL and satellite, and it supports all Windows versions, including 95/98/ME/NT4/2000 and XP, as well as AOL 5/6/7.

Internet Tweak 2002 is a special utility designed to configure and personalize hidden internet settings in Windows Me/98/95/2000/XP.

Modem Booster tests your internet speed settings systematically by running a series of diagnostic tests to see how much room there is for improvement, then determines the best settings for improved performance. Modem Booster uses Ping technology to fine-tune modem settings to the exact values customized to your ISP for maximum throughput, and it tests and optimizes hidden Windows connection settings for maximum internet speed. After this systematic fine-tuning process, Modem Booster reports the expected speed boost in percentage terms and tunes up the modem settings automatically.

Features: optimize internet connection performance by accessing Internet Explorer's, Outlook Express's and Netscape Communicator's hidden settings. In addition, you get hundreds of selected internet tips and tricks that will boost your browser's and e-mail applications' performance and productivity. Some of the settings include adjusting the NDI cache, RcvWindow, Time to Live and MaxMTU, boosting modem transfer speed, Black Hole Detect, changing IE and OE's animation logo and title bar, customizing the location of IE's shell folder, and hundreds more.

PC accelerators:

winROCKET provides the system with the most cost-effective performance boost available today.
MemoKit for Windows 95/98/Me/NT/2000/XP:
- Increase your computer speed by up to 100%
- Let your favorite applications use all the memory they need
- Prevent your computer from Windows crashes
- Optimize your computer's memory and recover all memory leaks

RAM accelerators:

RAMrocket defragments and recovers the system's memory, actually boosting the physical memory available to Windows. It is easy to install and use. RAMrocket's features:
- Increases the RAM available to Windows and your applications in just a few seconds
- Defragments your system's memory
- Recovers RAM from Windows and your applications
- Recovers RAM leaks from unstable programs
- Displays a real-time graph of available physical and virtual RAM
- Lets you run large applications simultaneously without slowing down your system
It is easy and powerful for both beginners and experts.

We all know that the computers in the laboratories have been updated or replaced with new ones, yet the internet connection is still slow. Compared to previous years, when the connection could be described as turtle-paced, today's connection has clearly improved. So upgrading or replacing the computers, I think, still isn't the best solution for a slow internet connection.

2) Proper bandwidth allocation. As far as I know, the university has already increased the bandwidth, but the connection is still slow. I think this is due to the number of computers sharing the allocated bandwidth. Even if we keep increasing the bandwidth, spending so much money, the problem will remain if the allocation is improper. However, if bandwidth is properly allocated depending on usage, the problem can somehow be solved. With this, the tasks that require the most bandwidth can be properly attended to, and the tasks that require the least bandwidth can be done at the same time.

3) Virus problems. There are a lot of different kinds of viruses in the computer laboratories.
Even if each computer has a virus scanner installed, if it is not properly used and updated, the computers are still prone to viruses that will later affect the computer as well as the internet connection. Students just insert and open their removable devices without hesitation. So if there is proper maintenance and updating of the scanners, together with a proper orientation for students on using them, the problem can somehow be solved. The least the university can do about the slow internet connection is to orient the users (faculty and staff as well as students) regarding the proper usage of the computers and the internet.

References:
http://greatware.net/speed/
http://wikieducator.org/Need_and_Importance_of_Information_Technology_in_Education | |
| | | Tanya Clarissa G. Amancio
Posts : 44 Points : 51 Join date : 2009-06-21 Age : 34 Location : davao
| Subject: Re: Assignment 6 (Due: before August 19, 2009, 13:00hrs) Wed Sep 09, 2009 7:55 pm | |
| If you were hired by the university president as an IT consultant, what would you suggest (technology, infrastructure, innovations, steps, processes, etc) in order for the internet connectivity be improved? (3000words)
-------------------------------------------------------------------------------------------------------------------------------------------
If I were tasked to improve internet connectivity in a university, here are some steps I would recommend regarding innovation. First, I would definitely look into their ISP (Internet Service Provider). Nowadays we are all aware of the spread of the internet, also known as the "Net." In an organization, in a company, at home, or even in universities and schools, the internet is one of the most essential parts of our daily life. Through the internet you can gain lots of information; you can also earn money. But for us students, the internet is very useful for doing our projects, reports, assignments, blogs, etc. Imagine our life without the internet? Maybe we would still be behind on some of the information, happenings and learning in our community. The internet is really important, especially to us IT students; we really need to gain knowledge and information through it.

What is the internet? The Internet is a global system of interconnected computer networks that use the standardized Internet Protocol Suite (TCP/IP) to serve billions of users worldwide. It is a network of networks that consists of millions of private and public, academic, business, and government networks of local to global scope that are linked by copper wires, fiber-optic cables, wireless connections, and other technologies. The Internet carries a vast array of information resources and services, most notably the inter-linked hypertext documents of the World Wide Web (WWW) and the infrastructure to support electronic mail.
In addition it supports popular services such as online chat, file transfer and file sharing, gaming, commerce, social networking, publishing, video on demand, and teleconferencing and telecommunications. Voice over Internet Protocol (VoIP) applications allow person-to-person communication via voice and video.

Examining and acquiring the best internet connection package is the root of the solution to internet connectivity, and I would definitely recommend two different ISPs for the university. With this I can easily manage public and private allocation of bandwidth, and it would also limit downtime, since I can easily route all connections from one ISP to the other in case of server issues or heavy traffic. Having two internet connections from two different ISPs is the best protection against downtime; you can also tunnel important connections through whichever ISP is better suited or has less traffic at the moment, and the university can easily filter access to specific sites by managing each ISP's filters. The ISP that the students use would have to be heavily filtered. Streaming sites, both video and audio, should not be blocked; instead, the bandwidth allocated to these sites should be limited to avoid heavy traffic. These measures can be bypassed with proxy servers outside the school, but setting up the campus with two different ISPs means that connectivity for the staff and other important connections would not be affected, since they would be using the other one. Budget-wise, two ISPs would be expensive up front, and maintenance would be doubled, since the university would be running a server for each ISP.

Secondly, I would look at the software and hardware issues in the university. What server do they use? How do clients connect to the internet? What operating system do their clients and server use?
These are just some of the questions I would ask in order to maximize connectivity across the university. Client machines for staff and for students should be set up differently: restrictions should focus on the units used by students and staff, while units in the library should be only lightly restricted.

On the hardware side, the school should move everything to wireless networking. This would eliminate cabling issues, and a wireless network is easier to maintain than cables, although it adds the question of who is allowed to use the Internet. Security also plays a big role in connectivity: it governs how clients connect to the network and gain Internet access. What matters most is for the admin to know who is connecting, so I would recommend that each user register his unit's MAC address with the server before it is granted access to the school's Wi-Fi. Rather than relying on a password alone for the wireless network, each connection should have its own identity, and this can only be done by recording its MAC address. This adds a lot of work for the admin, but it also keeps the campus's Internet bandwidth secure.

Innovation

The term innovation refers to a new way of doing something. Innovation is a key factor in long-term organisational success. While there is a short-term need to generate new products and services to meet shareholder expectations, innovation is also critical to long-term survival. It may refer to incremental and emergent or radical and revolutionary changes in thinking, products, processes, or organizations. In many fields, something new must be substantially different to be innovative, not an insignificant change, e.g., in the arts, economics, business, and government policy.
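Returning to the Wi-Fi proposal above, the MAC-address registration could be sketched like this. The addresses are invented examples, and a real access point would enforce the check itself.

```python
# Toy MAC-address allowlist: only registered adapters get Wi-Fi access.
# All addresses below are made-up placeholders.

registered_macs = {
    "00:1A:2B:3C:4D:5E",  # e.g. a registered staff laptop
    "00:1A:2B:3C:4D:5F",  # e.g. a registered student laptop
}

def normalize(mac):
    """Canonical form: upper-case, colon-separated."""
    return mac.replace("-", ":").upper()

def is_allowed(mac):
    return normalize(mac) in registered_macs

# The same adapter is recognized regardless of how the MAC is written.
assert is_allowed("00-1a-2b-3c-4d-5e")
assert not is_allowed("AA:BB:CC:DD:EE:FF")  # unregistered device
```

The normalization step matters in practice, because different operating systems print MAC addresses with different separators and letter cases.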
In economics the change must increase value, customer value, or producer value. The goal of innovation is positive change, to make someone or something better. Innovation leading to increased productivity is the fundamental source of increasing wealth in an economy.

Innovation is an important topic in the study of economics, business, design, technology, sociology, and engineering. Colloquially, the word "innovation" is often synonymous with the output of the process. However, economists tend to focus on the process itself, from the origination of an idea to its transformation into something useful, to its implementation; and on the system within which the process of innovation unfolds. Since innovation is also considered a major driver of the economy, especially when it leads to increasing productivity, the factors that lead to innovation are also considered to be critical to policy makers. In particular, followers of innovation economics stress using public policy to spur innovation and growth. Those who are directly responsible for application of the innovation are often called pioneers in their field, whether they are individuals or organizations.

In the organizational context, innovation may be linked to performance and growth through improvements in efficiency, productivity, quality, competitive positioning, market share, etc. All organizations can innovate, including for example hospitals, universities, and local governments. While innovation typically adds value, innovation may also have a negative or destructive effect as new developments clear away or change old organizational forms and practices. Organizations that do not innovate effectively may be destroyed by those that do. Hence innovation typically involves risk.
A key challenge in innovation is maintaining a balance between process and product innovations: process innovations tend to involve a business model that may improve shareholder satisfaction through greater efficiency, while product innovations build customer support, but at the risk of costly R&D that can erode shareholder return. In summary, innovation can be described as the result of some amount of time and effort spent researching (R) an idea, plus a larger amount of time and effort spent developing (D) that idea, plus a very large amount of time and effort spent commercializing (C) it in a marketplace with customers. (Reference needed)

Technology

What is technology? Technology is a broad concept that deals with human as well as other animal species' usage and knowledge of tools and crafts, and how it affects a species' ability to control and adapt to its environment. Technology has affected society and its surroundings in a number of ways. In many societies, technology has helped develop more advanced economies (including today's global economy) and has allowed the rise of a leisure class. It can be most broadly defined as the entities, both material and immaterial, created by the application of mental and physical effort in order to achieve some value. In this usage, technology refers to tools and machines that may be used to solve real-world problems. It is a far-reaching term that may include simple tools, such as a crowbar or wooden spoon, or more complex machines, such as a space station or particle accelerator. Tools and machines need not be material; virtual technology, such as computer software and business methods, falls under this definition of technology.

For the university, I would recommend high-end computer equipment that performs quickly and accurately; it would help with browsing the Internet at greater speed.
When we browse the Internet we can see that many websites now use special effects such as animations, add-ons, Flash, videos, and lots of pictures, all of which demand a better video card. So when we browse on the computers in the lab or at a nodal station, buffering is likely, and sometimes, if you have no other choice, you need to restart your unit. A high-speed Internet connection such as broadband will also allow closer collaboration with suppliers through shared plans, forecasts, and consumer data. Sharing such information with your suppliers makes it easier to: analyse real-time information about sales, orders, or market trends; forecast and react quickly to changes in demand; and improve efficiency, since accurate information on stock means you order only the supplies you need.

Infrastructure

What is infrastructure? According to Wikipedia, infrastructure can be defined as the basic physical and organizational structures needed for the operation of a society or enterprise, or the services and facilities necessary for an economy to function. The development and maintenance of essential public infrastructure is an important ingredient for sustained economic growth and poverty reduction. Poor infrastructure is perhaps the most binding constraint on growth throughout the university.

Maintenance

I think maintenance is one of the best ways to improve Internet connectivity. Here are some tips to improve the performance of our computers. Performing maintenance will increase the performance of the computer. The following three basic PC maintenance steps are highly recommended.

1. Viruses, worms, et al. can cause all manner of problems, from system crashes to loss of data. Professional programs to protect PCs against viruses are available, such as Network Associates' McAfee package and Symantec's Norton Antivirus software. Some anti-virus software is free.
Visit your favorite download site for a sampling, and be sure to scan for and remove any malware detected by the scanner of your choice.

2. Spyware / adware is installed on your PC without your consent or knowledge through websites, or comes bundled with other software packages. These programs run in the background, often sending information about your PC or yourself over the Internet. In doing so, they slow the computer down, occupying the CPU and the Internet connection. Two very popular spyware detection and removal programs are free to personal users: Patrick M. Kolla's SpyBot Search & Destroy and Lavasoft's Ad-Aware SE. Be sure to scan for and remove any spyware detected by the scanner of your choice.

3. Hard disk fragmentation is not an easy phenomenon to explain, but its symptom is a slowing, often drastic, of operations involving data transfer to and from your computer's hard disk drive. All modern versions of Windows include a disk defragmenter. In Windows XP, for example, it may be accessed via Start --> All Programs --> Accessories --> System Tools --> Disk Defragmenter. Be sure to run a complete hard disk defragmentation regularly.

MORE TIPS:

The modem. The first tip is to get rid of the modem and move to ADSL. Yes, broadband is available at low cost in most areas. Visit keyword broadband to see if it is supported by your exchange. If you are on broadband you can probably skip the rest of these tips, because you will be enjoying life rather than fretting about your connection speed.

Upgrade drivers. Driver files are updated regularly by most modem manufacturers. For some modems, you can also "flash upgrade" the software in the modem to provide the latest (and fastest) communications software. Even so, you should be sure the driver is right for your operating system. To find the latest drivers, just enter the modem details into a search engine such as aolsearch.aol.co.uk or www.google.co.uk together with the word "driver".
So, to find drivers for a USR Sportster modem, enter "USR Sportster driver" and follow any instructions on how to install them. You can check your current modem drivers from Control Panel: with Windows XP, select Start | Control Panel | Phone and Modem Options | Modems | Properties | Drivers.

Surf when the Yanks are in their PJs. The Internet is much faster when the rest of the world is asleep, so try surfing while eating your bowl of breakfast corn flakes. It's much more bracing!

Tweak your settings. Your PC has some settings that may improve modem throughput. All data sent over the Internet goes in packets, and the size of these packets is the Maximum Transmission Unit, or MTU. The aim is to send packets that are as large as possible without their needing to be broken into smaller packets, which would slow your connection. A modem user should set the MTU to 1500, the RWIN multiplier to 10, and Time to Live to 35. Download Tweak-Me or Tweak-XP to do this automatically for you.

Use FTP download wherever possible. If you want to download files, you can often choose between FTP and HTML download. FTP (File Transfer Protocol) is much faster for file transfers, so choose it when you can.

Use a high-speed port. This will only apply to readers with really old computers. The serial port may use an old, slow chip called a UART. The answer is to fit a high-speed serial port with a 16550 UART chip, or to fit an internal modem that includes one of these beasties.

Use a download tool. There's nothing more frustrating than losing a connection near the end of a one-hour download. The good news is that most downloads are resumable, which means they can be restarted from where the connection failed. You need the right tool to manage the reconnection; one of the best is the shareware GetRight.
GetRight also searches for the fastest download sites and splits the download between several sites, with the downloads running in parallel for the maximum possible speed.

Use a faster browser. Once you've connected to AOL, you can start any browser and run it in a second window. Opera is one of the fastest, so why not download a free copy and give it a test run?

Manage your cache. Every time you use the Internet, images and other files are downloaded onto your hard disk. If a particular image or other file is needed in a subsequent session, it can be pulled from the disk faster than it could be downloaded again. These files are kept in a "Temporary Internet Files" folder, often called the cache. When the folder is full, Windows deletes the oldest files. You can vary the size of this folder by visiting Control Panel and selecting Internet Options. If you increase it, more files can be stored on the hard disk, but if you go too far, a slow PC may spend too long searching the cache. You'll need to experiment to find the right level for your PC and connection speed.

Define a blank homepage. Each time you start a browser outside your AOL window, the browser goes to the defined homepage. If this is slow, change the homepage to a fast website; if you are a real speed nut, set it to blank. To do this, go to Internet Options as above and select Use Blank. Now your external browser is up and running in record time.

Don't display images. Text-only windows are much faster to download, and you can easily restore images when required. To set whether to display images, go to Internet Options (see above), select the Advanced tab, scroll down to the Multimedia section, then select or deselect "Show pictures." Select Apply, then OK, to save your change.
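The resumable downloads that tools like GetRight rely on work by requesting only the bytes that are still missing. A rough Python sketch, assuming a plain HTTP server that honours the Range header; the URL and filename in a real call would be placeholders of your own:

```python
import os
import urllib.request

def range_header(bytes_already_have):
    """HTTP Range header asking the server for the remaining bytes only."""
    return {"Range": "bytes=%d-" % bytes_already_have}

def resume_download(url, filename):
    """Append the missing tail of `url` to a partially downloaded file."""
    start = os.path.getsize(filename) if os.path.exists(filename) else 0
    request = urllib.request.Request(url, headers=range_header(start))
    with urllib.request.urlopen(request) as response, open(filename, "ab") as out:
        out.write(response.read())

# A fresh download starts at byte 0; a broken 1 KB download resumes at 1024.
assert range_header(0) == {"Range": "bytes=0-"}
assert range_header(1024) == {"Range": "bytes=1024-"}
```

Download accelerators extend the same idea: they issue several Range requests for different slices of the file in parallel, then stitch the slices together.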
References:
http://internetspeedmonitorpro.com/testing-and-improving-your-internet-speed/
http://en.wikipedia.org/wiki/
http://www.educause.edu/EDUCAUSE+Review/EDUCAUSEReviewMagazineVolume40/ImprovingInstitutionalPerforma/158023
Last edited by Tanya Clarissa G. Amancio on Sat Oct 03, 2009 5:30 am; edited 2 times in total | |
| | | leah_saavedra
Posts : 39 Points : 40 Join date : 2009-06-20 Age : 35
| Subject: Re: Assignment 6 (Due: before August 19, 2009, 13:00hrs) Sun Sep 20, 2009 11:31 pm | |
| Type of connection:
Broadband Internet access, often shortened to just broadband, is high-data-rate Internet access, typically contrasted with dial-up access using a 56k modem. Dial-up modems are limited to a bitrate of less than 56 kbit/s (kilobits per second) and require the full use of a telephone line, whereas broadband technologies supply more than double this rate, generally without disrupting telephone use.

Although various minimum bandwidths have been used in definitions of broadband, ranging from 64 kbit/s up to 2.0 Mbit/s, the 2006 OECD report is typical in defining broadband as having download data transfer rates equal to or faster than 256 kbit/s, while as of 2009 the United States Federal Communications Commission (FCC) defines "Basic Broadband" as data transmission speeds exceeding 768 kilobits per second (kbit/s) in at least one direction: downstream (from the Internet to the user's computer) or upstream (from the user's computer to the Internet). The trend is to raise the threshold of the broadband definition as the marketplace rolls out faster services. Data rates are defined in terms of maximum download because several common consumer broadband technologies such as ADSL are "asymmetric", supporting a much slower maximum upload data rate than download.

Broadband is often called "high-speed" Internet, because it usually has a high rate of data transmission. In general, any connection to the customer of 256 kbit/s (0.256 Mbit/s) or greater is considered broadband Internet. The International Telecommunication Union Standardization Sector (ITU-T) recommendation I.113 has defined broadband as a transmission capacity faster than primary-rate ISDN, at 1.5 to 2 Mbit/s. The FCC definition of broadband is 768 kbit/s (0.8 Mbit/s).
The Organization for Economic Co-operation and Development (OECD) has defined broadband as 256 kbit/s in at least one direction and this bit rate is the most common baseline that is marketed as "broadband" around the world. There is no specific bitrate defined by the industry, however, and "broadband" can mean lower-bitrate transmission methods. Some Internet Service Providers (ISPs) use this to their advantage in marketing lower-bitrate connections as broadband.
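The overlapping definitions above (OECD at 256 kbit/s, FCC at 768 kbit/s, ITU-T at primary-rate ISDN) can be compared with a small helper. This is just an illustration, treating the ITU-T threshold as 1.5 Mbit/s:

```python
# Classify a downstream rate (in kbit/s) against the broadband
# definitions quoted in the text above.

def classify(kbps):
    labels = []
    if kbps >= 256:
        labels.append("broadband (OECD, >= 256 kbit/s)")
    if kbps > 768:
        labels.append("Basic Broadband (FCC 2009, exceeding 768 kbit/s)")
    if kbps >= 1500:
        labels.append("broadband (ITU-T I.113, faster than primary-rate ISDN)")
    return labels or ["below every broadband definition (dial-up class)"]

# A 56k modem meets no definition; a 2 Mbit/s ADSL line meets all three.
assert classify(56) == ["below every broadband definition (dial-up class)"]
assert len(classify(2000)) == 3
assert len(classify(300)) == 1  # OECD only
```

This makes the point in the text concrete: the same marketed "broadband" line can satisfy one regulator's definition and fail another's.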
In practice, the advertised bandwidth is not always reliably available to the customer; ISPs often allow a greater number of subscribers than their backbone connection or neighborhood access network can handle, under the assumption that most users will not be using their full connection capacity very frequently. This aggregation strategy works more often than not, so users can typically burst to their full bandwidth most of the time; however, peer-to-peer (P2P) file sharing systems, which often require extended periods of high bandwidth, stress these assumptions and can cause major problems for ISPs who have excessively overbooked their capacity. For more on this topic, see traffic shaping.

As takeup of these introductory products increases, telcos are starting to offer higher-bit-rate services. For existing connections, this usually just involves reconfiguring the existing equipment at each end of the connection. As the bandwidth delivered to end users increases, the market expects that video-on-demand services streamed over the Internet will become more popular, though at present such services generally require specialized networks. The data rates on most broadband services still do not suffice to provide good-quality video, as MPEG-2 video requires about 6 Mbit/s for good results. Adequate video for some purposes becomes possible at lower data rates, with rates of 768 kbit/s and 384 kbit/s used for some video-conferencing applications, and rates as low as 100 kbit/s used for videophones using H.264/MPEG-4 AVC. The MPEG-4 format delivers high-quality video at 2 Mbit/s, at the low end of cable modem and ADSL performance.

In telecommunications and signal processing, baseband is an adjective that describes signals and systems whose range of frequencies is measured from zero to a maximum bandwidth or highest signal frequency; it is sometimes used as a noun for a band of frequencies starting at zero.
It can often be considered a synonym of lowpass, and an antonym of passband, bandpass, or radio frequency (RF) signal. A signal at baseband is often used to modulate a higher-frequency carrier wave in order that it may be transmitted via radio. Modulation results in shifting the signal up to much higher frequencies (radio frequencies, or RF) than it originally spanned. A key consequence of the usual double-sideband amplitude modulation (AM) is that, usually, the range of frequencies the signal spans (its spectral bandwidth) is doubled. Thus, the RF bandwidth of a signal (measured from the lowest frequency as opposed to 0 Hz) is usually twice its baseband bandwidth. Steps may be taken to reduce this effect, such as single-sideband modulation; the highest frequency of such signals greatly exceeds the baseband bandwidth.

Some signals can be treated as baseband or not, depending on the situation. For example, a switched analog connection in the telephone network has energy below 300 Hz and above 3400 Hz removed by bandpass filtering; since the signal has no energy very close to zero frequency, it may not be considered a baseband signal, but in the telephone system's frequency-division multiplexing hierarchy, it is usually treated as a baseband signal, by comparison with the modulated signals used for long-distance transmission. The 300 Hz lower band edge in this case is treated as "near zero", being a small fraction of the upper band edge.
Baseband vs. Broadband

Data signals can be sent over a network cable in one of two ways: broadband or baseband. A good example of broadband signaling is cable television, where a single coaxial cable carries multiple signals and your cable box selects between the different channels.
Baseband signaling, in contrast, sends only a single signal over the cable. This type of signaling is typically used in Ethernet networks, with the exception of the rarely used 10BROAD36 standard. Baseband uses very simple transceiver devices that send and receive signals on a cable. The simplicity of baseband signaling is that only three states need to be distinguished: one, zero, and idle. Broadband transceivers are much more complex, because they must distinguish those same states on multiple channels within the same cable. Because of its simplicity, baseband signaling is used on most Ethernet networks.
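Before moving on to topology, the ISP oversubscription described earlier can be put in numbers. All figures below are invented examples, not measurements from any real ISP:

```python
# Hypothetical example: an ISP sells 8 Mbit/s plans to 50 subscribers
# sharing a 100 Mbit/s backbone link (all figures made up).

def contention_ratio(subscribers, plan_mbps, backbone_mbps):
    """How many times over the backbone is sold."""
    return subscribers * plan_mbps / backbone_mbps

def worst_case_share(plan_mbps, backbone_mbps, subscribers):
    """Per-user rate if every subscriber transfers at once (e.g. heavy P2P)."""
    return min(plan_mbps, backbone_mbps / subscribers)

assert contention_ratio(50, 8, 100) == 4.0  # the link is oversold 4:1
assert worst_case_share(8, 100, 50) == 2.0  # each user drops to 2 Mbit/s
```

This is why bursty web browsing works fine on an oversubscribed link, while sustained peer-to-peer traffic from many users at once pushes everyone toward the worst-case share.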
Type of topology: Network topology is the arrangement of the physical interconnections of the elements (links, nodes, etc.) of a computer network. A local area network (LAN) is one example of a network that exhibits both a physical topology and a logical topology. Any given node in the LAN has one or more links to one or more other nodes in the network, and the mapping of these links and nodes in a graph results in a geometrical shape that may be used to describe the physical topology of the network. Likewise, the mapping of the data flows between the nodes in the network determines the logical topology of the network. The physical and logical topologies may or may not be identical in any particular network. Any particular network topology is determined only by the graphical mapping of the configuration of physical and/or logical connections between nodes. The study of network topology uses graph theory. Distances between nodes, physical interconnections, transmission rates, and/or signal types may differ in two networks and yet their topologies may be identical.
There are also three basic categories of network topologies: • physical topologies • signal topologies • logical topologies
The terms signal topology and logical topology are often used interchangeably, though there is a subtle difference between the two.
Physical topologies
The mapping of the nodes of a network and the physical connections between them – i.e., the layout of wiring, cables, the locations of nodes, and the interconnections between the nodes and the cabling or wiring system.
Classification of physical topologies
Point-to-point
The simplest topology is a permanent link between two endpoints. Switched point-to-point topologies are the basic model of conventional telephony. The value of a permanent point-to-point network is that of guaranteed, or nearly guaranteed, communication between the two endpoints. The value of an on-demand point-to-point connection is proportional to the number of potential pairs of subscribers, as expressed by Metcalfe's Law.
Permanent (dedicated)
Easiest to understand, among the variations of point-to-point topology, is a point-to-point communications channel that appears, to the user, to be permanently associated with the two endpoints. A children's "tin-can telephone" is one example; a microphone wired to a single public-address speaker is another. These are examples of physical dedicated channels. Within many switched telecommunications systems, it is possible to establish a permanent circuit. One example might be a telephone in the lobby of a public building, which is programmed to ring only the number of a telephone dispatcher. "Nailing down" a switched connection saves the cost of running a physical circuit between the two points. The resources in such a connection can be released when no longer needed, as with, for example, a television circuit from a parade route back to the studio.
Switched: Using circuit-switching or packet-switching technologies, a point-to-point circuit can be set up dynamically, and dropped when no longer needed. This is the basic mode of conventional telephony.
Bus network topology
In local area networks where bus technology is used, each machine is connected to a single cable through some kind of connector. A terminator is required at each end of the bus cable to prevent the signal from bouncing back and forth on the bus. A signal from the source travels in both directions to all machines connected to the bus cable until it finds the MAC address or IP address of the intended recipient. If the machine's address does not match the intended address for the data, the machine ignores the data; if it does match, the data is accepted. Since the bus topology consists of only one wire, it is rather inexpensive to implement compared with other topologies; however, the low cost of implementing the technology is offset by the high cost of managing the network. Additionally, since only one cable is used, it is a single point of failure: if the network cable breaks, the entire network goes down. On the other hand, with only one cable and no intermediate devices, transfer speeds between the computers on the network can be fast.
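The accept/ignore behaviour described above can be sketched as a toy broadcast, in which every station on the bus sees the frame but only the addressed one accepts it. The station addresses below are made-up placeholders:

```python
# Toy model of frame delivery on a bus: the frame reaches every station,
# and each station compares the destination address with its own.

stations = ["00:01", "00:02", "00:03"]  # invented station addresses

def deliver(frame_dst, stations):
    """Return which stations accept the frame and which ignore it."""
    return {s: ("accept" if s == frame_dst else "ignore") for s in stations}

result = deliver("00:02", stations)
assert result == {"00:01": "ignore", "00:02": "accept", "00:03": "ignore"}
```

The same comparison happens in real Ethernet adapters in hardware, which is why a machine on a shared bus normally never sees data addressed to its neighbours (unless the adapter is put into promiscuous mode).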
Linear bus
The type of network topology in which all of the nodes of the network are connected to a common transmission medium which has exactly two endpoints (this is the 'bus', also commonly referred to as the backbone or trunk). All data transmitted between nodes in the network is transmitted over this common transmission medium and is able to be received by all nodes in the network virtually simultaneously (disregarding propagation delays).
Note: The two endpoints of the common transmission medium are normally terminated with a device called a terminator that exhibits the characteristic impedance of the transmission medium and which dissipates or absorbs the energy that remains in the signal, to prevent the signal from being reflected or propagated back onto the transmission medium in the opposite direction, which would cause interference with and degradation of the signals on the transmission medium (see electrical termination).

Distributed bus
The type of network topology in which all of the nodes of the network are connected to a common transmission medium which has more than two endpoints, created by adding branches to the main section of the transmission medium. The physical distributed bus topology functions in exactly the same fashion as the physical linear bus topology (i.e., all nodes share a common transmission medium).
Notes: 1.) All of the endpoints of the common transmission medium are normally terminated with a device called a 'terminator' (see the note under linear bus). 2.) The physical linear bus topology is sometimes considered to be a special case of the physical distributed bus topology – i.e., a distributed bus with no branching segments. 3.) The physical distributed bus topology is sometimes incorrectly referred to as a physical tree topology – however, although the physical distributed bus topology resembles the physical tree topology, it differs from the physical tree topology in that there is no central node to which any other nodes are connected, since this hierarchical functionality is replaced by the common bus.
Star network topology
In local area networks where the star topology is used, each machine is connected to a central hub. In contrast to the bus topology, the star topology allows each machine on the network a point-to-point connection to the central hub. All of the traffic that traverses the network passes through the central hub, which acts as a signal booster or repeater, in turn allowing the signal to travel greater distances. Because each machine connects directly to the hub, the star topology is considered the easiest topology to design and implement, and an advantage of the star topology is the simplicity of adding other machines. The primary disadvantage of the star topology is that the hub is a single point of failure: since the hub is connected to every machine on the network, if the hub fails, the entire network fails.
Notes: 1.) A point-to-point link (described above) is sometimes categorized as a special instance of the physical star topology – therefore, the simplest type of network that is based upon the physical star topology would consist of one node with a single point-to-point link to a second node, the choice of which node is the 'hub' and which node is the 'spoke' being arbitrary. 2.) After the special case of the point-to-point link, as in note 1.) above, the next simplest type of network that is based upon the physical star topology would consist of one central node – the 'hub' – with two separate point-to-point links to two peripheral nodes – the 'spokes'. 3.) Although most networks that are based upon the physical star topology are commonly implemented using a special device such as a hub or switch as the central node (i.e., the 'hub' of the star), it is also possible to implement a network that is based upon the physical star topology using a computer or even a simple common connection point as the 'hub' or central node – however, since many illustrations of the physical star network topology depict the central node as one of these special devices, some confusion is possible, since this practice may lead to the misconception that a physical star network requires the central node to be one of these special devices, which is not true because a simple network consisting of three computers connected as in note 2.) above also has the topology of the physical star. 4.) Star networks may also be described as either broadcast multi-access or nonbroadcast multi-access (NBMA), depending on whether the technology of the network either automatically propagates a signal at the hub to all spokes, or only addresses individual spokes with each communication.
Extended star A type of network topology in which a network that is based upon the physical star topology has one or more repeaters between the central node (the 'hub' of the star) and the peripheral or 'spoke' nodes, the repeaters being used to extend the maximum transmission distance of the point-to-point links between the central node and the peripheral nodes beyond that which is supported by the transmitter power of the central node or beyond that which is supported by the standard upon which the physical layer of the physical star network is based.
Note: If the repeaters in a network that is based upon the physical extended star topology are replaced with hubs or switches, then a hybrid network topology is created that is referred to as a physical hierarchical star topology, although some texts make no distinction between the two topologies.
Distributed Star A type of network topology that is composed of individual networks that are based upon the physical star topology connected together in a linear fashion – i.e., 'daisy-chained' – with no central or top level connection point (e.g., two or more 'stacked' hubs, along with their associated star connected nodes or 'spokes').
Ring network topology
In local area networks where the ring topology is used, each computer is connected to the network in a closed loop or ring, and each machine has a unique address that is used for identification purposes. The signal passes through each machine connected to the ring in one direction. Ring topologies typically use a token-passing scheme to control access to the network; under this scheme, only one machine can transmit on the network at a time. The machines connected to the ring act as signal boosters or repeaters, strengthening the signals that traverse the network. The primary disadvantage of the ring topology is that the failure of one machine can cause the entire network to fail.
Mesh
The value of a fully meshed network grows very rapidly with the number of subscribers: by Reed's Law, the number of possible communicating groups, from any pair of endpoints up to and including all the endpoints, grows exponentially (on the order of 2^n).
Fully connected mesh topology The type of network topology in which each of the nodes of the network is connected to each of the other nodes in the network with a point-to-point link – this makes it possible for data to be simultaneously transmitted from any single node to all of the other nodes. Note: The physical fully connected mesh topology is generally too costly and complex for practical networks, although the topology is used when there are only a small number of nodes to be interconnected.
Partially connected mesh topology The type of network topology in which some of the nodes of the network are connected to more than one other node in the network with a point-to-point link – this makes it possible to take advantage of some of the redundancy that is provided by a physical fully connected mesh topology without the expense and complexity required for a connection between every node in the network.
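The trade-off between the two mesh variants can be made concrete with a quick count (an illustrative sketch; the function names are my own): a full mesh of n nodes needs n(n-1)/2 point-to-point links, while Reed's Law counts the possible communicating groups as 2^n - n - 1.

```python
# Why a fully connected mesh gets costly fast: link count grows
# quadratically, while the number of possible communicating groups
# (every subset of two or more endpoints, per Reed's Law) grows
# exponentially.
def full_mesh_links(n):
    """Point-to-point links required to fully connect n nodes."""
    return n * (n - 1) // 2

def reeds_law_groups(n):
    """Number of possible groups of two or more endpoints (Reed's Law)."""
    return 2**n - n - 1

for n in (4, 10, 50):
    print(n, full_mesh_links(n), reeds_law_groups(n))
```

For 10 nodes a full mesh already needs 45 links, which is why the partial mesh is usually preferred once the node count grows.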
Tree network topology Also known as a hierarchical network. In this type of network topology a central 'root' node (the top level of the hierarchy) is connected by point-to-point links to one or more nodes one level lower in the hierarchy (the second level); each of the second level nodes is in turn connected, also by point-to-point links, to one or more nodes one level lower still (the third level), and so on. The top level central 'root' node is the only node that has no other node above it in the hierarchy, and the hierarchy of the tree is symmetrical: each node in the network has a specific fixed number of nodes connected to it at the next lower level, that number being referred to as the 'branching factor' of the hierarchical tree.
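The 'branching factor' gives a quick way to count the nodes in such a symmetric tree (an illustrative sketch, not from the source text):

```python
# Total nodes in a symmetric hierarchical tree: with branching factor b
# and d levels (the 'root' alone is level 1), the count is the geometric
# sum 1 + b + b**2 + ... + b**(d-1) = (b**d - 1) // (b - 1).
def tree_nodes(branching, levels):
    return (branching**levels - 1) // (branching - 1)

print(tree_nodes(3, 3))  # root + 3 children + 9 grandchildren = 13
```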
The list above shows the appropriate types of connection for our school and which kinds of topology are best. As for which media are better, I suggest we use low-cost cables like coax, STP, or UTP. Fiber optics is good, but considering the financial status of our school, the three types above are more suitable. http://en.wikipedia.org/wiki/Network_topology http://en.wikipedia.org/wiki/Baseband http://en.wikipedia.org/wiki/Broadband
Last edited by leah_saavedra on Tue Sep 22, 2009 12:57 am; edited 3 times in total | |
| | | karl philip abregana
Posts : 29 Points : 37 Join date : 2009-06-22
| Subject: Re: Assignment 6 (Due: before August 19, 2009, 13:00hrs) Mon Sep 21, 2009 12:41 am | |
1. Boost computer speed It is important to use high-speed computers in order for our connection to be fast. But since the university lacks the budget to buy or even just upgrade our computers, it is not practical for us to implement this. 2. Use software that speeds up any internet connection, for example: WEBROCKET webROCKET is a powerful, easy-to-use program for Windows 95, 98, Me, NT, 2000, and XP which accelerates your Internet connection speed by up to 200%. What does webROCKET do? Without webROCKET, Windows lacks the power to provide you with an optimal Internet connection because of changing, unstable network conditions. webROCKET automatically turbo-charges your Internet connection by boosting Internet data transport efficiency. webROCKET adapts your modem or high-speed connection to its maximum potential. webROCKET is compatible with any home or office Internet connection that works in Windows, including dial-up modems of any speed, and high-speed connections such as cable modems, DSL, ISDN, T-1, LAN, etc. It works with all Internet services, including AOL and local ISPs. The disadvantage of using accelerators is that the quality of the interface will be decreased. Not a bad option for us, since our concern is only to speed up the internet connection. But if I were to ask, I would not sacrifice quality for speed. 3. Use repeaters or routers Network repeaters regenerate incoming electrical, wireless or optical signals. With physical media like Ethernet or Wi-Fi, data transmissions can only span a limited distance before the quality of the signal degrades. Repeaters attempt to preserve signal integrity and extend the distance over which data can safely travel. Thus, if the distance from the CAS to the ENG'G building or to other buildings exceeds the maximum, we can use repeaters to regenerate the signal. Routers are specialized computers that send your messages and those of every other Internet user speeding to their destinations along thousands of pathways. 
So if we use routers, choosing the best path will evidently accelerate internet speed. Our school is currently using routers, but the internet connection is still very slow. 4. Careful selection and planning of the medium to be used. Since we are already using fiber-optic cables and UTP cables (if I'm right), we should maintain and maximize the use of these media in order for data to travel faster. 5. Increase in bandwidth We've already increased our bandwidth, but our connection still hasn't worked out well. So I think it is not the solution to our problem, because increasing bandwidth will cost us much. 6. Proper usage As we've discussed in our MIS subject, no matter how fast or advanced the technology you're using, it will be useless if the users don't use their minds... What I mean is, these are only tools, and we are the users of these tools. As I searched the web, here are some tips on how to improve internet connectivity: Broadband Connection Speed The only thing better than a fast broadband Internet connection is a faster broadband Internet connection. Broadband Internet speed tests allow you to measure your current broadband speed against that of faster broadband Internet connections. There are various programs and software packages that you can purchase through which you can increase the speed of your Internet connection. You can also make adjustments to the hardware components (upgrade processor speed and memory levels) of your system, maximizing your computer's broadband connection potential. If you're not looking to purchase additional software/hardware add-ons, there are manual "tweaks" that you can make to your system through which you can boost your broadband speed. Increasing Your Broadband Internet Speed Let's assume that you access the Internet via a broadband LAN line. 
The following are 3 examples of ways through which you can manipulate your network settings and increase your broadband speed: Reduce your network latency by increasing the request buffer size Tests of LAN broadband connections have shown that delays can be caused as a result of the default request buffer size setting of 4356 decimal. It has been shown that an increase to a 16384 decimal setting can allow for better performance. (Such an increase is only possible if you have the necessary memory.) By utilizing this slight "tweak," you can noticeably increase your Internet speed and broadband networking capabilities. Altering your network task scheduler If you've encountered long waits when trying to open network folders, then this "tweak" is for you. One of the default settings with broadband networking is that when you open a network folder, your system performs a test of the networked computer in order to search for scheduled tasks. By disabling this option for a LAN connection, you can increase your broadband networking speed. Increasing your network transfer rate Transfer rate, also referred to as throughput, refers to the speed at which data can be moved from one location to another. Network redirector buffers serve the purpose of optimizing your disk performance, thereby allowing for the fastest possible broadband networking speed. If you increase the number of network redirector buffers functioning on your system, you could greatly increase your throughput. An Internet speed test following this change will yield noticeable results. Internet Speed Tests If you are looking to perform an Internet speed test for your system, there are various tools online through which you can provide your ISP, your area code, your connection type, etc., and receive a reading of your broadband speed compared to the top providers in your area. 
This allows you to see if your broadband speed is lacking in comparison to others, and to work to maximize your broadband networking potential. This can be achieved through implementation of the above tweaks or through hardware upgrades and software purchases. How to Increase Internet Connection Speed Instructions Things You'll Need: • Computer • Internet connection • Web browser • ISP telephone number 1. Step 1 Find out from your Internet Service Provider (ISP) what Internet connection speed you're paying for. Make sure the speed you're paying for is the speed programmed in their network. 2. Step 2 Test your Internet connection speed. You can do this by going to one of these speed test websites: Speakeasy.net/speedtest or Speedtest.net. Record your results. 3. Step 3 Compare the speeds from step one and step two. If you're getting the speed you're paying for, go no further. If you're not, go to the next step. 4. Step 4 Disable web-browser add-ons that can slow down your Internet connection speed. Check to see if you have multiple web browser add-ons operating with your browser. For example, if your web browser is Internet Explorer, go to Tools, select Manage Add-ons, and look at what add-ons are enabled. Disable the ones you do not want to use. 5. Step 5 Run anti-virus, adware, spyware, and malware scans. All of these, if found on your computer, could negatively affect your Internet connection speed. 6. Step 6 Run Disk Cleanup and Disk Defragmenter from your System Tools menu. 7. Step 7 Download TCP Optimizer software to optimize your computer's MTU (Maximum Transmission Unit) values, RWIN (Receive Window) values, and broadband-related registry keys. The most popular and free TCP Optimizer that I found is called "SG TCP Optimizer". 
You can download it at CNET: http://www.download.com/SG-TCP-Optimizer/3000-2155_4-10488572.html?tag=lst-1 or at PCWORLD: http://www.pcworld.com/downloads/file/fid,68524-order,1-page,1/description.html. 8. Step 8 Retest your Internet connection speed by going to one of these speed test websites: Speakeasy.net/speedtest or Speedtest.net. Record and compare these results with the results obtained from steps one and two. Step 1: Modifying your LAN Properties. a) Go to Start > Network Connections. b) Right-click > Properties on your main Internet connection. c) Make the following change: unselect everything except "Internet Protocol (TCP/IP)". Step 2: Running the LvlLord Patch. In Win XP SP1, TCP connections were set to unlimited. However, Microsoft limited these connections to 10 in SP2. You can open more TCP connections and give your speed an all-time high by following these steps: a) Extract the patch and run it. b) Press "C" to change the limit and set it to 100. c) Press "Y". After you do that, you'll be prompted with a Windows XP message saying that your original files are being replaced. That's normal, so don't panic. Click "Ignore" or "Cancel" on that window, and press "Yes" after it asks for a confirmation. d) You'll get a message that your patch was successfully executed. Exit and reboot your computer. Don't forget to bookmark this page (hit CTRL+D) so you can return once you've rebooted. Step 3: SG TCP Optimizer - using it! Back already? Cool! So, this is where you use SpeedGuide's TCP Optimizer. How? It's your lucky day, my friend. a) This program modifies your system registry (nothing to worry about!) in order to boost your Internet speed. Here's what you have to do: b) Choose your connection speed - mine is 256 kbps. Go down and select "Optimal Settings". Click on Apply changes after that. You'll be prompted with a confirmation box. c) Click OK, and click "Yes" to reboot your computer. 
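Two quick back-of-the-envelope calculations sit behind the steps above (a hedged sketch; the RTT and file-size figures are assumptions, and the test URL in the comment is a placeholder, not a real server):

```python
import time
import urllib.request

def kbps(num_bytes, elapsed_seconds):
    """Convert a byte count and transfer time into kilobits per second."""
    return num_bytes * 8 / 1000 / elapsed_seconds

def measure_kbps(url):
    """DIY version of the Step 2 / Step 8 speed test: time a download."""
    start = time.time()
    data = urllib.request.urlopen(url).read()
    return kbps(len(data), time.time() - start)

def max_throughput_bps(window_bytes, rtt_seconds):
    """Why RWIN tuning matters: sustained TCP throughput is capped at
    roughly window_size / round_trip_time (bits per second)."""
    return window_bytes * 8 / rtt_seconds

print(kbps(125000, 1.0))                # a 125 KB download in 1 s is 1000.0 kbps
print(max_throughput_bps(4356, 0.1))    # default 4356-byte buffer at 100 ms RTT
print(max_throughput_bps(16384, 0.1))   # the suggested 16384-byte buffer
# measure_kbps("http://example.com/testfile.bin")  # requires network access
```

The last two prints show why the buffer tweak helps: at a 100 ms round trip, the default buffer caps throughput near 348 kbps, while the larger one raises the ceiling to roughly 1.3 Mbps.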
That's all you need for a full-on high-speed Internet connection! Enjoy browsing, and don't forget to check out the other tweaks and tricks present on this site. (Hint: bookmark this page in order to make an easy comeback.) | |
| | | ailaine adaptar
Posts : 50 Points : 57 Join date : 2009-06-19 Age : 104
| Subject: Re: Assignment 6 (Due: before August 19, 2009, 13:00hrs) Tue Sep 22, 2009 11:31 am | |
| If you were hired by the university president as an IT consultant, what would you suggest (technology, infrastructure, innovations, steps, processes, etc) in order for the internet connectivity be improved? (3000words)***********************************
IT consultant work review:
Today IT consulting has become a major opportunity for many IT professionals who want to work for themselves. It is no longer only the domain of the high-flying international organization. In fact, tens of thousands of IT professionals are leaving their regular jobs to set up as IT consultants on their own. Although there are many consulting opportunities available, it is quite a challenge to make a success of your own IT consulting business.
My stand: If ever I were hired by the university president as an IT consultant, I would recommend the establishment and maintenance of the security requirements necessary to protect university information, computing and network resources, and to minimize the potential both for attacks on USEP resources and for attacks launched from USEP resources against other sites.
My first suggestion would be the creation of a local Department of Information and Network Technology (DINTech), or whatever name the university prefers. It would be an entity responsible for planning, coordinating, implementing, regulating, and administering information and network technology. It will promote, develop, and regulate integrated and strategic information and communication technology systems, and reliable and cost-efficient communication facilities and services. Throughout this article, I would say, we will come to realize the importance of this department. This department must have its own office for security and integrity purposes.
Secondly, I would like to strengthen the security requirements. Security shall be in place for the protection of the privacy of information, protection against unauthorized modification of information, protection of systems against the denial of service, and protection of systems against unauthorized access. Users are reminded that all usage of USEP's information technology resources is subject to all University policies.
I would suggest that university information, computing, and network resources may be accessed or used only by individuals authorized by the university. The university encourages the use of computing and network resources and respects the privacy of users. Nonetheless, the university may access information stored on the University's network of computers for the purposes listed below. I got these lists from an article, so as an IT consultant (in this case), I point out which types of users may exercise such privileges.
a. Troubleshooting hardware and software problems - This must be limited to the network administrator and computer technicians. b. Preventing unauthorized access and system misuse - There are cases in our school where students encounter a problem with the computer they are using. To solve the problem, they modify the properties of the hardware and software in our virtual libraries by themselves, which often causes another, worse problem. c. Retrieving University business-related information - This must be limited to the officer-in-charge of the business. d. Investigating reports of violation of university policy - There must be a special force in charge of this area. This may include the illegal distribution of software from the resources of the university. It also includes the unauthorized charging of notebooks (laptops) and other electronic devices from university outlets, etc. e. Complying with legal requests for information f. Rerouting or disposing of undeliverable mail - This must be the lookout of the network administrator. This is an important issue that the net-admin must consider.
g. Addressing safety or security issues
To the greatest extent possible in a public setting, individuals' privacy should be preserved. However, privacy or confidentiality of documents and messages stored on University-owned equipment cannot be guaranteed. Users of electronic mail systems should be aware that, in addition to being subject to authorized access, electronic mail in its present form cannot be secured and is, therefore, vulnerable to unauthorized access and modification by third parties.
Systems that are found to pose a threat to the integrity of the information, computing and network resources may have their access to these resources suspended. The suspension of services will continue until the problem has been remedied and the system validated by the Department of Information and Network Technology (this is the department that I want our university to have, at least) for operation within the USEP information, computing and network resources environment. The University reserves the right to invoke emergency suspension of services without prior notification if the situation poses a serious threat to the information technology environment.
Persons in violation of this policy are subject to the full range of sanctions, including the loss of computer or network access privileges with or without notification, disciplinary action, dismissal from the University, and legal action. Some violations may constitute criminal offenses, as outlined in USEP statutes and other local, state, and federal laws; the University will carry out its responsibility to report such violations to the appropriate authorities.
I would suggest that the university recheck its system and analyze its weaknesses and strengths. The university must check whether it meets the minimum requirements that a good internet and information system must possess. To guide the university in doing this, I have below an excerpt from an article on the internet that would help.
Initial Network Hook-up: Each system must be capable of passing a test for vulnerabilities to hacker attacks and relaying of unsolicited email prior to being attached to USEP's information, computing and network resources. System testing will be the responsibility of the Departmental/Unit or University Security Officer. [In our case, this would be the Department of Information and Network Technology.]
Password Specification: Password Policy: All passwords on any system, whether owned by USEP or by an individual, directly connected to the University network must adhere to the following standards when technically possible. This includes devices connected to the campus network with a direct wired connection, wireless, dial-in modem, remote access software (e.g., Windows Remote Desktop), use of a Virtual Private Network (VPN), and the like. This policy applies to all passwords - eID, system, user, database, application, etc. Any system that does not comply may have its network access blocked without prior notification.
Password Standards: a. Passwords must have a minimum of 7 characters. b. Passwords must contain characters from 3 of the 4 following categories: i. Uppercase letters ii. Lowercase letters iii. Numbers iv. Special Characters (for example: !,@,#,$,%,^,&,*, etc. But be aware if traveling outside the U.S. that some symbols, like the U.S. dollar sign, may not be available on international keyboards) c. Passwords cannot be the same as the USEP eID and must not be easily guessed (for example: no variants of the USEP eID, dictionary words, family names, pet names, birthdates, etc.). d. Passwords must be changed at least twice a year (eID password changes are during a designated time at the beginning of the fall and spring semesters). e. Passwords must be changed significantly and cannot repeat more frequently than every two years. f. Passwords that are written down or stored electronically must not be accessible to anyone other than the owner and/or issuing authority. g. The same password used to access USEP systems (for example, your eID password) must not be used for accounts or other forms of access to non-USEP systems or applications such as online shopping, banking, etc. h. Passwords must not be shared unless explicitly permitted by the issuing authority. eID passwords must not be shared under any circumstances. i. Anyone who believes their password has been compromised must immediately notify their departmental or college IT support, or the IT Help Desk, to evaluate possible risks. j. Default passwords in vendor-supplied hardware or software must be changed during initial installation or setup. k. The eID password must never be transmitted over the network in clear text (i.e., it must always be encrypted in transit). It is also strongly recommended that other types of passwords be encrypted in transit.
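Standards (a) through (c) above lend themselves to an automated check. A hedged sketch of such a validator follows (the function name and this subset of rules are my own illustration, not part of any USEP system):

```python
import string

def password_ok(password, eid=""):
    """Check standards (a)-(c): length >= 7, 3 of 4 character categories,
    and not containing the user's eID (simple case-insensitive match)."""
    if len(password) < 7:
        return False
    if eid and eid.lower() in password.lower():
        return False
    categories = [
        any(c.isupper() for c in password),       # uppercase letters
        any(c.islower() for c in password),       # lowercase letters
        any(c.isdigit() for c in password),       # numbers
        any(c in string.punctuation for c in password),  # special characters
    ]
    return sum(categories) >= 3

print(password_ok("Usep#2009"))          # True: upper, lower, digit, symbol
print(password_ok("password"))           # False: only one character category
print(password_ok("MyEid123", "myeid"))  # False: contains the eID
```

A real deployment would also need the history, expiry, and dictionary-word rules in (c)-(e), which require stored state rather than a pure function.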
Unattended Computers To protect against unauthorized access to data on computers left unattended, the following precautions are required: a. Enable password protection on the screen saver for all university computers with the exception of special-purpose computers designed for public access, such as information or registration kiosks, public computers in the library, or computer labs where locking is undesirable due to the risk of a user monopolizing a shared computer. The length of time before the password-protected screen saver comes on should be set to 20 minutes or less. For lab situations, it is recommended that computers be set to automatically logout after at the most 30 minutes of idle time. b. Never leave your computer unattended and unprotected. Before leaving your computer, lock the display or log out in a manner that requires a password to gain access. Protection from Malicious Software and Intrusions: Malicious software, or "malware", comes in many forms - viruses, worms, Trojan horses, denial of service attacks, botnets, spyware, adware, spam relays, etc. All pose a security risk, some of which are a very serious threat to the confidentiality, integrity, or availability of USEP's information and technology resources. Appropriate precautions must be taken to protect USEP systems and information from compromise by malware. To that end, USEP may require the installation of essential security software on computers connected to the USEP campus network or accessing USEP information and technology resources. The following sections define specific requirements for antivirus, spyware/adware, personal firewalls, and e-mail. Assuring the validity of malware protection software is the responsibility of each user, the department/unit security representative, and the USEP Security Officer.
Virus Protection a. The following computers must use the university-supplied antivirus software configured in a managed mode ("managed mode" allows a server to monitor and configure the antivirus protection on the client computer and push updates to the client on demand): i. Any university-owned computer ii. Student-owned computers in USEP residence halls iii. Users of USEP's wireless or wired network if it is a university-owned computer or one that belongs to a current USEP faculty, staff, or student. b. All other computers accessing the USEP campus network or information technology resources must be running active, up-to-date virus protection software. Current USEP faculty, staff, and students may run the university-supplied antivirus software on their home computers at no cost to meet this requirement. c. Antivirus software must be activated when the computer boots up and remain active at all times during its operation. d. Real-time file scanning must be enabled, where files are scanned for malicious anomalies before they are written to the hard drive. e. The version of the antivirus software (i.e., the antivirus program or engine) must be no more than one version behind the current version offered by the vendor or the version endorsed by USEP, and must be supported by the vendor. f. Virus definition files (i.e., the database in the antivirus software that identifies known malware) must be up-to-date with the most current version available from the vendor. g. Checking for and installing updates to virus definition files and antivirus software must be automated and performed at least daily. h. Comprehensive virus scans of all local hard drives must be performed at least weekly. Spyware/Adware Protection a. All computers connected to the campus network must run active spyware/adware protection software. b. Spyware/adware definition/detection rules must be up-to-date with the most current version available from the vendor. c. 
Scans of all local hard drives for spyware/adware must be performed at least weekly. Personal Firewall Protection a. All computers using the university-supplied security software (which includes virus, spyware, intrusion, and firewall protection) must have the firewall enabled. b. Any other computer connected to the campus network must run a personal firewall. Microsoft Windows Firewall is an acceptable personal firewall. E-mail Protection a. All campus e-mail servers must provide antivirus protection that detects and mitigates infected e-mail messages. b. Infected messages must be discarded or quarantined, not returned to the sender.
Security Patches All systems connected to the campus network, and the applications and databases running on those systems, must have the latest security patches available from the respective vendors applied. Any system or application with known vulnerabilities for which a patch is not available must take appropriate measures to mitigate the risk, such as placing the system behind a firewall. USEP may block network access for systems that have not been patched.
College/Departmental Systems Colleges, departments, or other USEP units may institute their own distributed computing system, as these provide valuable specialized services to users. These servers, in order to protect the University resources to which they are connected, must be kept no more than one version behind the current vendor-supported version of the operating system and application software and comply with all security requirements and standards set forth in this policy. Campus units with qualified IT support staff may run their own security management environment with the university-supplied security software that provides virus, spyware, intrusion, and firewall protection. The unit security management system must be configured to provide reports to the central security management system to facilitate comprehensive campus-wide reporting. In the absence of qualified IT support staff, units must use the central security management services for malware protection. Assurance of server protection is the responsibility of the Department of Information and Network Technology.
Enforcement Enforcement of these policies and associated standards is the responsibility of the Department of Information and Network Technology or designee. Any system that does not comply with security policies and standards, is susceptible to a known vulnerability, or is compromised may have its network access blocked immediately and without prior notification to protect the integrity of other systems and data.
Any device directly connected to the campus network (i.e., with a direct wired or wireless connection, dial-in modem, remote access software like Windows Remote Desktop, use of a Virtual Private Network (VPN), and the like) may be scanned and assessed by designated DINTech or security staff at any time to determine compliance with security policies and standards, or detect anomalous activities, vulnerabilities, and security compromises. Firewalls must be configured to permit this remote scanning function. Scanning may only be performed to the extent necessary to detect and assess the risk.
USEP must have defined procedures for restoring network access after a vulnerable or compromised system has been repaired. The Chief IT Security Officer will determine whether the repair will require the computer to be reformatted and the operating system, software, and data re-installed, depending on the nature of the compromise.
Security Personnel Responsibilities: University IT Security Officer: The University employee who leads the IT security program to protect USEP's information, computing, and network resources. Responsibilities include assisting with university-wide IT security policies, controls and procedures; developing and maintaining security architecture, standards, and guidelines; monitoring compliance with IT security policies and standards; risk assessment; coordinating responses to security incidents; communication with organizations outside the University; chairing the Security Incident Response Team; and promoting training and awareness of the secure use of information, computing and network resources. IT Security Analyst: Technical personnel in central information technology units assigned with responsibility for the secure operation of information, computing and network security at the enterprise level. Responsibilities include monitoring the state of information, computing and network security; detection and remediation of security incidents, implementation of preventative measures, configuration and management of security technology (for example, firewalls and intrusion detection systems), and communication of alerts and remedies to departmental/unit security representatives.
Security Incident Response Team (SIRT): A team with representatives from each academic college and major administrative unit that provides advisory, proactive, and reactive support for USEP's IT security program. Responsibilities include coordinating the campus-wide response to major security incidents; coordinating implementation of preventative measures in their colleges/units; communicating threats and best practices to their colleges/units; approving requests for restoring network access to vulnerable or compromised computers; participating in the development of IT security policies, standards, guidelines, and procedures; and assisting with IT security training and awareness efforts. SIRT duties should constitute no more than 30% of an individual's job responsibilities.
Departmental Security Representatives: The primary point of contact for departments for IT security matters. The departmental security representative serves as a liaison between SIRT and the department by assisting with communication, facilitating implementation of preventative measures in the department, and coordinating the response to security incidents involving technology or data within the department.
Deans and Department Heads: Responsibilities include authorizing access to computer systems in their units, ensuring that System Users understand and agree to comply with University and unit security policies, and ensuring that the technical and procedural means and resources are in place to assist in maintaining the security policies and procedures outlined above.
System Users: Responsibilities include agreeing to and complying with all applicable University and unit security policies and procedures; taking appropriate precautions to prevent unauthorized use of their accounts, software programs, and computers; protecting university data from unauthorized access, alteration, or destruction; representing themselves truthfully in all forms of electronic communication; and respecting the privacy of electronic communication.
Appropriate use of information technology resources includes instruction; independent study; authorized research; independent research; and official work of the offices, units, recognized student and campus organizations, and agencies of the University.
Authorized users are: (1) faculty, staff, and students of the University; (2) anyone connecting from a public information service; (3) others whose access furthers the mission of the University and whose usage does not interfere with other users' access to resources. In addition, a user must be specifically authorized to use a particular computing or network resource by the campus unit responsible for operating the resource.
It is my responsibility as a consultant to be aware of the potential for and possible effects of manipulating information, especially in electronic form, to understand the changeable nature of electronically stored information, and to continuously verify the integrity and completeness of the information that I compile or use. I am responsible for the security and integrity of University information stored on my individual desktop computing system.
All of my proposals and suggestions above are still useless if they are not put into operation. But above all, this must be kept in mind: appropriate use of information technology resources includes instruction; independent study; authorized research; independent research; and the official work of the offices, units, recognized student and campus organizations, and agencies of the University. Everybody, from the top down to the lowest level, must be aware of this.
Reference: http://www.elsevier.com/wps/find/bookdescription.cws_home/679850/description#description http://www.wikipedia.com
_aiai_
Last edited by ailaine adaptar on Sat Oct 17, 2009 1:27 pm; edited 2 times in total (Reason for editing : for the formatting) | |
| | | Venus Millena
Posts : 41 Points : 41 Join date : 2009-06-23 Age : 34 Location : lacson ext., brgy. obrero, davao city
| Subject: Re: Assignment 6 (Due: before August 19, 2009, 13:00hrs) Wed Sep 30, 2009 1:26 pm | |
| The university provides an Internet connection that is far from conducive: it is very slow, unsecured against computer viruses, and limited in the features and websites it allows. Transactions are blocked, prohibiting students from viewing video files and listening to music because of the high bandwidth these consume. I understand that the government is cutting the university's budget, but we are students with active minds, full of ideas that could someday bear fruit through self-improvement and maturity. We need a wide area to explore so that we can discover things learned through our own actual practice and involvement. We have to spread our wings, but how can we do this if so many things are prohibited? How can we learn independently if the university limits the students' ground for exploration in cyberspace?
1. Traffic in Internet Connection Management
Traffic congestion is one of the university's Internet connectivity problems. The network equipment at a hosting company cycles through each person downloading a file and transfers a small portion at a time, so each person's transfer can take place, but every transfer becomes slower. If 100 people all came to the site and downloaded at the same time, the transfers would be extremely slow. If the host wanted to decrease the time it takes to download files simultaneously, it could increase the bandwidth of its Internet connection (at a cost, due to upgraded equipment). The greater the bandwidth, the greater the cost. I know that increasing bandwidth is not easy to implement because it is very costly and the university may not be able to finance it. But if we want a better and faster Internet connection, we have to make sacrifices, if we are willing to do so. If the university really cannot afford higher bandwidth, then it should do bandwidth management. Bandwidth management is the process of measuring and controlling the communications (traffic, packets) on a network link, to avoid filling the link to capacity or overfilling it, which would result in network congestion and poor performance.
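The fair-share arithmetic above can be sketched in a few lines. The figures here (a 100 Mbps link, a 10 MB file) are hypothetical illustrations, not actual university numbers:

```python
# Illustrative arithmetic only: how a shared link divides among
# simultaneous downloads (all figures are hypothetical).

def per_user_throughput(link_mbps: float, users: int) -> float:
    """Fair share of a link, assuming equal round-robin scheduling."""
    return link_mbps / users

def download_seconds(file_mb: float, throughput_mbps: float) -> float:
    """Time to move file_mb megabytes at throughput_mbps megabits/s."""
    return (file_mb * 8) / throughput_mbps

share = per_user_throughput(100.0, 100)   # 100 Mbps link, 100 users
print(share)                              # 1.0 Mbps each
print(download_seconds(10.0, share))      # a 10 MB file now takes 80.0 s
```

The same 10 MB file takes under one second when the link is idle, which is why a handful of heavy downloads can make the connection feel broken for everyone else.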
Identity-based Bandwidth Management:
With identity-based bandwidth management, administrators establish priorities based on users, web categories, groups, and applications, with precise bandwidth allocation based on usage and time of day. The Internet content filtering module complements bandwidth management by blocking access to high-bandwidth audio-video downloads, gaming, tickers, ads, and more. This ensures that business- and bandwidth-critical applications like CRM and VoIP gain guaranteed bandwidth. Enterprises can fine-tune their bandwidth policies based on changing user requirements as well as actual usage, for continually improved network performance.

Bandwidth Prioritization enables policies that allocate bandwidth to high-priority business traffic, letting enterprises deliver uninterrupted access to business-critical applications and users. At the same time, they retain control over recreational traffic and heavy, bandwidth-guzzling media applications.

Committed and Burstable Bandwidth: enterprises can create policies that allocate guaranteed bandwidth to users, assigning each a minimum and a maximum. Committed bandwidth ensures that critical users receive constant levels of bandwidth during peak and non-peak traffic periods. Burstable bandwidth allocation allows users to receive greater bandwidth when it is available, ensuring optimal usage of the resource.

Time-based Bandwidth Allocation: enterprises can schedule and regulate bandwidth according to user requirements. High bandwidth can be provided to a user during a particular time of day when uninterrupted access is required. By doing this, enterprises lower the peaks in bandwidth usage across the day, limiting the need to purchase bandwidth for excessively high peaks and controlling operating expenses.
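One common mechanism behind this kind of per-user bandwidth cap is a token bucket. A minimal sketch, assuming illustrative rate and burst figures (not values from any particular product):

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter, one way to enforce the
    per-user caps described above.
    rate     = tokens (bytes) replenished per second
    capacity = burst allowance in bytes"""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity          # start with a full bucket
        self.last = time.monotonic()

    def allow(self, nbytes: float) -> bool:
        # Refill tokens in proportion to elapsed time, up to capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= nbytes:
            self.tokens -= nbytes
            return True
        return False                    # packet must be queued or dropped

bucket = TokenBucket(rate=125_000, capacity=250_000)  # ~1 Mbps, 250 KB burst
print(bucket.allow(200_000))  # True: within the burst allowance
print(bucket.allow(200_000))  # False: bucket nearly empty, must wait
```

Committed versus burstable allocation maps naturally onto the two parameters: the refill rate is the committed floor, while the bucket capacity sets how far a user may burst above it.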
Reference: www.wikipedia.com
2. Better Internet Security
When a computer connects to a network and begins communicating with others, it is taking a risk. Internet security involves the protection of a computer's Internet accounts and files from intrusion by unknown users. Basic security measures involve protection by well-selected passwords, changing of file permissions, and backup of the computer's data.

Security concerns are in some ways peripheral to normal business working, but they serve to highlight just how important it is that business users feel confident when using IT systems. Security will probably always be high on the IT agenda simply because cyber criminals know that a successful attack is very profitable. This means they will always strive to find new ways to circumvent IT security, and users will consequently need to be continually vigilant. Whenever decisions are made about how to enhance a system, security must be held uppermost among its requirements.

IT security is a vital need of every organization. Effective Internet security in particular has become essential for every small, medium, or large enterprise that uses information technology and Internet-based services to perform its work easily and effectively. Organizations' dependence on the Internet has increased the need for Internet security implementation and network monitoring. Companies from the private and public sectors, non-government organizations, educational institutes, and financial institutions all depend on the Internet for information exchange, and the Internet is also a major channel of instant communication. Therefore, the chances of information leaks, hacking, or intrusion are greater than in earlier days.
A security vulnerability in the Internet connection or intranet can result in the following major security threats:
1. Unauthorized access to servers and systems in the network
2. Unauthorized use of the Internet connection for illegal or criminal purposes
3. Stealing, alteration, or deletion of sensitive systems and data
4. Denial-of-service attacks, resulting in an inability by users to access systems
5. Virus or Trojan attacks on systems, and virus infections in important data
6. Destruction of websites and online systems

The threats described above are just a glimpse of those caused by a weak Internet security mechanism. Information is an asset that, like other important business assets, has value to an organization and consequently needs to be suitably protected. Failure to implement a proper Internet security mechanism can ultimately have severe effects. An organization with no Internet security policy, or a less effective one, can suffer the following ill effects:
1. Deterioration of the organization's overall reputation
2. Reduced public confidence in the agency's online services
3. Unauthorized disclosure of the company's secret information
4. Financial loss through online fraud
5. Financial loss from reduced productive work hours due to intrusion

To secure the workplace from potential Internet threats, an organization has to adopt a proper Internet security policy, utilize the best available security tools, and practice strict monitoring measures (both manual and automated) inside office premises. With proper planning, technical expertise, and continuous effort, an organization can restrict most of the external threats related to Internet security.
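The "well-selected passwords" precaution mentioned above extends to how passwords are stored. A minimal sketch of salted password hashing using only Python's standard library (the iteration count is an illustrative assumption, not a mandated value):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Salted PBKDF2 hash: store the (salt, digest) pair, never the
    password itself.  100,000 iterations is an illustrative choice."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)  # constant-time compare

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("guess123", salt, digest))                      # False
```

Even if an intruder steals the account database, salting and key stretching make recovering the original passwords far more expensive than cracking plain or unsalted hashes.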
Reference: www.wikipedia.com
3. Having a VoIP gateway. VoIP (Voice over Internet Protocol) is simply the transmission of voice traffic over IP-based networks.
A VoIP gateway is a network device which converts voice and fax calls, in real time, between an IP network and the Public Switched Telephone Network (PSTN). It is a high-performance gateway designed for Voice over IP applications. Typically, a VoIP gateway comes with the ability to support at least two T1/E1 digital channels. Most VoIP gateways feature at least one Ethernet port and one telephone port. Controlling a gateway can be done with the help of various protocols like MGCP, SIP, or LTP.
Benefits of VoIP Gateways
The main advantage of a VoIP gateway is that it can provide a connection for your existing telephones and fax machines through the traditional telephone networks, PBXs, and key systems. This makes the process of making calls over the IP network familiar to VoIP customers. VoIP gateways can terminate a call from the telephone, provide user admission control using an IVR (Interactive Voice Response) system, and provide accounting records for the call. Gateways also help direct outbound calls to a specific destination, or can terminate a call from another gateway and send it to the PSTN. VoIP gateways play a major role in enhancing carrier services, keeping telephone calls simple, cheap, and easy to access. Flexible call integration has been developed at low cost, enabling programmable call progress tones and distinctive ring tones.

Functions of VoIP Gateways

The main functions of VoIP gateways include voice and fax compression or decompression, control signaling, call routing, and packetization. VoIP gateways are also packed with additional features such as interfaces to external controllers like gatekeepers or softswitches, network management systems, and billing systems.

Future of VoIP Gateway Technology

Over the years, the VoIP gateway has become an efficient and flexible solution used for office data and voice connectivity. Besides connectivity performance, VoIP also offers better reliability under a variety of circumstances. The future of VoIP gateway technology is clear: high-density, scalable, open platforms need to be designed and implemented to allow the millions of installed telephones and the fast-growing number of H.323 computer clients (such as Netscape's Communicator and Microsoft's NetMeeting) to communicate over IP.
Many vendors are in the process of designing interoperable VoIP gateways according to the latest architectures to meet the changing demands of service providers, corporate network clients, and individual carriers.
How does VoIP work?
VoIP, or Voice over Internet Protocol (sometimes called Internet telephony), is touted in some circles as the technology of the future. The reasoning is simple, really. VoIP is bringing possibilities to the forefront of technological thinking because those possibilities were dismissed as impossible just a few years ago. VoIP uses a broadband Internet connection for routing telephone calls, as opposed to conventional switching and fiber-optic alternatives. This process holds great promise for higher efficiency and lower cost for communication consumers. One interesting aspect of the technology is that, for the user, no large-scale infrastructure is required. It is all about combining the functionality of the Internet and a conventional phone into one single service with minimal software and hardware support.
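The packetization idea behind VoIP can be sketched with plain UDP datagrams on the loopback interface. This is a toy illustration, not a real RTP/SIP stack; the 160-byte frame size assumes 20 ms of 8 kHz, 8-bit audio:

```python
import socket

FRAME = 160  # bytes per 20 ms frame at 8000 samples/s, 1 byte per sample

# Receiver: any free UDP port on loopback stands in for the far end.
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))
recv.settimeout(2.0)
addr = recv.getsockname()

# Sender: chop a fake "audio stream" into frames, one datagram each.
send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
audio = bytes(range(256)) * 5          # 1280 bytes of stand-in audio
frames = [audio[i:i + FRAME] for i in range(0, len(audio), FRAME)]
for f in frames:
    send.sendto(f, addr)               # one datagram per voice frame

received = [recv.recvfrom(FRAME)[0] for _ in frames]
print(len(frames), len(received))      # 8 8
assert b"".join(received) == audio     # stream reassembles (lossless loopback)
```

Real VoIP adds timestamps, sequence numbers, and jitter buffers on top of exactly this pattern, because on a congested link (unlike loopback) datagrams can arrive late, reordered, or not at all.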
Reference: www.tech-faq.com/voip-gateway.shtml#
4. Using Open Source Software
Open Source's proponents often claim that it offers significant benefits when compared to typical commercial products. Commercial products typically favour visible features (giving a marketing advantage) over harder-to-measure qualities such as stability, security, and similar less glamorous attributes. As a shorthand, we shall describe this phenomenon as quality vs. features. Open Source software developers are evidently motivated by many factors, but favouring features over quality is not noticeable amongst them. For many developers, peer review and acclaim are important, so it is likely that they will prefer to build software that is admired by their peers. Highly prized factors are clean design, reliability, and maintainability, with adherence to standards and shared community values preeminent. "The Open Source community attracts very bright, very motivated developers, who although frequently unpaid, are often very disciplined. In addition, these developers are not part of corporate cultures where the best route to large salaries is to move into management, hence some Open Source developers are amongst the most experienced in the industry. In addition all users of Open Source products have access to the source code and debugging tools, and hence often suggest both bug fixes and enhancements as actual changes to the source code. Consequently the quality of software produced by the Open Source community sometimes exceeds that produced by purely commercial organisations." (QINETIQ2001)
1. Reliability

Reliability is a loose term. Broadly, we can take it to mean the absence of defects which cause incorrect operation, data loss, or sudden failures, perhaps what many people would mean when they use the term `bug'. Strictly, a bug would also mean failure to meet the specification, but since most Open Source projects dispense with the concept of anything easily recognisable as a formal specification, it is hard to point to that as a good way of defining what is a bug and what is a feature. Determining what constitutes a bug is usually by agreement amongst the developers and users of the software (an overlapping community in many cases). Obvious failure to perform is easily recognised as a bug, as is failure to conform to appropriate published standards. Security-related failings (exploits or vulnerabilities) are clearly bugs too. Each of these kinds of bugs is usually addressed with speedy fixes wherever possible, and Open Source advocates claim very rapid time-to-fix characteristics for the software. The pattern with closed-source software is typically that a defect report needs to be filed and then there will be a delay before the vendor determines when or whether to issue an updated release. Users of the software are much more at the mercy of the vendor's internal processes than with the Open Source arrangement, and the personal experience of the authors is that it can be extremely frustrating to move from the Open Source to the closed model. "The market greatly values robustness, and the Open Source model, particularly as practiced by Linux, encourages a large market of early adopters (compared to the size of the early market for commercial products) who actively help debug the software. Consequently much Open Source software becomes highly robust at a surprisingly early stage of its development, and mature Open Source products are setting new industry standards for bulletproofness." (QINETIQ2001)
2. Stability

In a business environment software is mostly a necessary evil, a tool to do a job. Unless the job changes or more efficient processes are discovered, there is rarely pressure or need to alter the software that is being used to assist the task. This is more or less directly counter to what motivates software vendors, who are in the unenviable position of supplying a commodity that does not wear out or age much. The vendors need a stable revenue stream to be able to keep their business going, whilst their customers have not the slightest desire to change or upgrade any product that is working well enough to suit their needs. If a software supplier can establish a virtual monopoly and then force upgrades onto its audience (as has been the history of the software industry since the mid 1960s), the profits can be very high. Software vendors can apply a number of tactics to persuade their customers to upgrade more or less willingly. Typical tactics include moving to allegedly new and improved file formats (which require the new and improved software to read them) or withdrawing support and bug fixes for older versions after a short period. The problem for users of the software is that they rarely have much control over that process and are left isolated if they choose to remain with older versions that they consider acceptable. This has cost and control implications for the business. Open Source software is not a panacea in the world of ever-changing software, but the worst effects of vendor-push can be mitigated. The way that Open Source products tend to conform closely to standards efforts has an inertial effect, since standards change but slowly and interchange formats are often particularly stable. As a result, incompatible file formats can be less of an issue.
If they are standards-based then they typically aren't an issue at all, and if they are formats unique to the software product (proprietary formats in a sense) then they cannot be undocumented, since the source code that uses them is itself published. In the real world, no business is static and software changes to meet new requirements. A choice to use Open Source software can provide a counter to the pressure to upgrade for the vendor's commercial purposes, but cannot shelter every user from all change. Having access to the source code can allow a business to choose to support itself on an old version where necessary, and we believe that in general it gives more options and choice to the users. Nonetheless, some upgrading and maintenance effort will always be needed. Putting the choice in the hands of the users rather than the suppliers is hard to criticize.

Auditability

A rarely understood benefit of Open Source software (any software where the source code is published) is its auditability. Closed-source software forces its users to trust the vendor when claims are made for qualities such as security, freedom from backdoors, adherence to standards, and flexibility in the face of future changes. If the source code is not available, those claims remain simply claims. By publishing the source code, authors make it possible for users of the software to have confidence that there is a basis for those claims. Whether this takes the form of a cursory and informal inspection or more rigorous auditing, what is clear is that without access to the source, third-party inspection is impossible. At present the industry does not insist on third-party inspection or certification, but it is possible that as open source models become more popular, expectations of audits will rise.
CONECTA2000 notes: "We can easily see that open source software has a distinct advantage over proprietary systems, since it is possible to easily and quickly identify potential security problems and correct them. Volunteers have created mailing lists and auditing groups to check for security issues in several important networking programs and operating system kernels, and now the security of open source software can be considered equal or better than that of desktop operating systems. It has also already been shown that the traditional approach of security through obscurity leaves too many open holes. Even now that the Internet reaches just a part of the world, viruses and cracker attacks can pose a significant privacy and monetary threat. This threat is one of the causes of the adoption of open source software by many network-oriented software systems."
3. Cost

Most current Open Source projects are also available free of royalties and fees, leading to the confusion around the commonly used term `free software'. Regrettably the English language does not have separate concepts for free-of-charge and free as in unconstrained; other languages are better equipped to describe the difference between `freedom' and `free of charge' (libre vs. gratis). Proponents of free software licences tend to emphasise liberty over cost, although in practice the main open source projects are free in both senses of the word. From a business perspective the purchase cost of software is only one factor; total cost of ownership (TCO) is what really matters. Other things being equal, the solution with the lowest TCO is usually the most desirable one. Arguments in favour of a low TCO for open source software include:
1. Possibly zero purchase price
2. Potentially no need to account for copies in use, reducing administrative overhead
3. Claimed reduced need for regular upgrades (giving lower or nil upgrade fees and lower management costs)
4. Claimed longer uptimes and reduced need for expensive systems administrators
5. Near-zero vulnerability to viruses, eliminating the need for virus checking and the associated data loss and downtime
6. Claimed lower vulnerability to security breaches and hack attacks, reducing the systems administration load
7. Claimed ability to prolong the life of older hardware while retaining performance
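A toy TCO comparison makes the point that purchase price is only one term in the sum. All figures below are invented purely for illustration:

```python
# Hypothetical TCO arithmetic: licence fees are one cost factor among
# several (administration, upgrades).  Every number here is made up.

def tco(license_fee, seats, admin_cost, upgrade_fees, years):
    """Total cost of ownership over a period:
    per-seat licences + yearly administration + yearly upgrade fees."""
    return license_fee * seats + (admin_cost + upgrade_fees) * years

proprietary = tco(license_fee=200, seats=50, admin_cost=3000,
                  upgrade_fees=2000, years=5)
open_source = tco(license_fee=0, seats=50, admin_cost=4000,
                  upgrade_fees=0, years=5)
print(proprietary, open_source)  # 35000 20000
```

Note the deliberately higher administration figure on the open-source side: the zero purchase price can be partly offset by support and training costs, which is exactly why TCO, not sticker price, is the number to compare.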
Open Source projects have very little motivation to attempt this kind of lock-in strategy. Since there is no commercial benefit to be had, adherence to de-jure or de-facto standards (where they exist) is typically high. Where standards for interworking do not exist, the fact that the source code is published means that proprietary data formats can't be used to manipulate lock-in. This at least partly explains the relative success of Open Source software in infrastructure areas. Many vendors have tried to create web servers to compete with Apache, but because the network protocol used between browsers and the web server is well specified, they have had to compete on quality or features rather than through more insidious tactics. Any vendor that controlled the lion's share of the browser and the server market would feel strongly tempted to exclude competitors by proprietary extensions to the HTTP protocol if they thought they could get away with it. No single vendor has yet managed to control both ends of this equation to a great enough degree. "Open Source software tends to be free of dependency on related products. Purchasers often perceive that the product works best with other products from the same manufacturer. Open Source software offers its users greater freedom to purchase other products, avoiding lock-in to particular manufacturers." (QINETIQ2001)
Open Source software provides further flexibility through freedom.
1. Freedom from a single vendor Software vendors can go out of business, and they can arbitrarily decide to cease development of a product. How would your business cope if it relied on such a product? Open-source software allows you to retain not just the right to use the software you already have, but the ability to continue to use it as your needs change.
2. Freedom to modify your software You aren't limited to what one company believes you need. Proprietary software vendors must cater for many different companies, predominantly their own. Open-source software can be tailored for the way you do business. It is usually within the resources of all but the smallest companies to modify Open Source software to suit their own needs (and potentially then to make those enhancements available as a public good). If in-house development skills don't exist, a short email to the project's mailing list will usually find a suitable consultant.
Reference: www.tech-faq.com/voip-gateway.shtml#
5. Free Wi-Fi (wireless fidelity) connection in the university's area of responsibility

Wi-Fi allows connectivity in peer-to-peer (wireless ad hoc network) mode, which enables devices to connect directly with each other. This connectivity mode can prove useful in consumer electronics and gaming applications. Many consumer devices use Wi-Fi: among others, personal computers can network with each other and connect to the Internet, mobile computers can connect to the Internet from any Wi-Fi hotspot, and digital cameras can transfer images wirelessly. Routers which incorporate a DSL modem or a cable modem and a Wi-Fi access point, often set up in homes and other premises, provide Internet access and internetworking to all devices connected to them, wirelessly or by cable. One can also connect Wi-Fi devices in ad hoc mode for client-to-client connections without a router. Wi-Fi also enables places which would traditionally not have network access to be connected. In business environments, as elsewhere, increasing the number of Wi-Fi access points provides redundancy, support for fast roaming, and increased overall network capacity by using more channels or by defining smaller cells. Wi-Fi enables wireless voice applications (VoWLAN or WVOIP). Over the years, Wi-Fi implementations have moved toward "thin" access points, with more of the network intelligence housed in a centralized network appliance, relegating individual access points to the role of mere "dumb" radios. Outdoor applications may utilize true mesh topologies. As of 2007, Wi-Fi installations can provide a secure computer networking gateway, firewall, DHCP server, intrusion detection system, and other functions.
Operational advantages

Wi-Fi allows local area networks (LANs) to be deployed without wires for client devices, typically reducing the costs of network deployment and expansion. Spaces where cables cannot be run, such as outdoor areas and historical buildings, can host wireless LANs. Wireless network adapters are now built into most laptops. The price of chipsets for Wi-Fi continues to drop, making it an economical networking option included in ever more devices. Wi-Fi has become widespread in corporate infrastructures, and is available in more than 220,000 public hotspots and tens of millions of homes and corporate and university campuses worldwide. The current version of Wi-Fi Protected Access encryption (WPA2) is not easily defeated, provided strong passwords are used. New protocols for Quality of Service (WMM) make Wi-Fi more suitable for latency-sensitive applications (such as voice and video), and power-saving mechanisms (WMM Power Save) improve battery operation.
WIFI FEATURES AND BENEFITS:
1. High Speed Internet Access
2. Site Surveys
3. Enterprise Grade Equipment
4. Scalable Systems
5. VPN Compatibility
6. Flexible Authentication Methods
7. Experienced Installation Technicians
8. Maintenance Agreements
9. Free and Pay Services
10. 24 x 7 x 365 End User Technical Support
11. 24 x 7 x 365 Equipment Monitoring
12. Support and Usage Reports via your personal login through the Hospitality WiFi Control Panel
Wifi Benefits:
1. Guest Satisfaction
2. Convenient End User Mobility
3. Increased Foot Traffic
4. Increased Revenue
5. Easy Access
6. Unlimited Flexible Billing Options
7. Complete Turn-Key Solution
References: www.wikipedia.com www.hospitalitywifi.com/featuresbenefits.htm
6. Insourcing
Insourcing means developing or building a system using the company's own assigned personnel who are responsible for, and knowledgeable about, system development. If the university's programmers and IT instructors, who are capable of developing the information system, do the job, that would be very good: it is making use of our own resources. In short, love your own. It is cheaper than proprietary software or outsourcing. It is also a challenge to the instructors to prove their ability in their field, to do their best, and to encourage students to do actual practice. It is a live demonstration of what IT professionals and programmers will be in the future.
This is all I can propose if I am hired as IT consultant of the university.
Last edited by Venus Millena on Sat Oct 03, 2009 4:36 am; edited 4 times in total | |
| | | shane sacramento
Posts : 58 Points : 60 Join date : 2009-06-19 Age : 32 Location : davao city
| Subject: Re: Assignment 6 (Due: before August 19, 2009, 13:00hrs) Wed Sep 30, 2009 9:04 pm | |
| Internet

The Internet connects people all over the world; through it, the transfer of information has become very easy and fast. Before the Internet was introduced, people had to put in far more effort transporting data and information manually, which took more time and even cost more money.
The concept and foundation for the creation of the Internet were developed by three individuals and a research conference, each of which changed the way we thought about technology by accurately predicting its future:
Vannevar Bush wrote the first visionary description of the potential uses for information technology with his description of the "memex" automated library system.
Norbert Wiener invented the field of Cybernetics, inspiring future researchers to focus on the use of technology to extend human capabilities.
The 1956 Dartmouth Artificial Intelligence conference crystallized the concept that technology was improving at an exponential rate, and provided the first serious consideration of the consequences.
Marshall McLuhan made the idea of a global village interconnected by an electronic nervous system part of our popular culture.
In 1957, the Soviet Union launched the first satellite, Sputnik I, triggering US President Dwight Eisenhower to create the ARPA agency to regain the technological lead in the arms race. ARPA appointed J.C.R. Licklider to head the new IPTO organization with a mandate to further the research of the SAGE program and help protect the US against a space-based nuclear attack. Licklider evangelized within the IPTO about the potential benefits of a country-wide communications network, influencing his successors to hire Lawrence Roberts to implement his vision.
Roberts led development of the network, based on the new idea of packet switching discovered by Paul Baran at RAND, and a few years later by Donald Davies at the UK National Physical Laboratory. A special computer called an Interface Message Processor was developed to realize the design, and the ARPANET went live in early October, 1969. The first communications were between Leonard Kleinrock's research center at the University of California at Los Angeles, and Douglas Engelbart's center at the Stanford Research Institute.
The first networking protocol used on the ARPANET was the Network Control Program. In 1983, it was replaced with the TCP/IP protocol developed by Robert Kahn, Vinton Cerf, and others, which quickly became the most widely used network protocol in the world.
In 1990, the ARPANET was retired and transferred to the NSFNET. The NSFNET was soon connected to the CSNET, which linked Universities around North America, and then to the EUnet, which connected research facilities in Europe. Thanks in part to the NSF's enlightened management, and fueled by the popularity of the web, the use of the Internet exploded after 1990, causing the US Government to transfer management to independent organizations starting in 1995.

Reference: http://www.livinginternet.com/i/ii_summary.htm
Infrastructure description
The overall responsibility for managing Internet Protocol addresses and domain names at the upper levels is vested in the Internet Assigned Numbers Authority (IANA), which delegates the actual administration of most functions to other bodies.
At the global and regional levels, the principal bodies providing allocation and registration services that support the operation of the Internet are:
RIPE NCC (Réseaux IP Européens Network Coordination Centre)
ARIN (American Registry for Internet Numbers)
APNIC (Asia Pacific Network Information Centre)
LACNIC (Latin American and Caribbean IP address Regional Registry)
AfriNIC (African Regional Registry for Internet Number Resources)
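As a small aside on address allocation, Python's standard ipaddress module can distinguish the reserved private ranges from globally routable, registry-allocated space. This is a purely local check, not a lookup against any registry:

```python
import ipaddress

# Classify a few sample addresses: RFC 1918 private ranges versus
# globally routable space that the regional registries hand out.
for text in ["192.168.1.10", "8.8.8.8", "10.0.0.5"]:
    addr = ipaddress.ip_address(text)
    kind = "private (RFC 1918)" if addr.is_private else "global (RIR-allocated)"
    print(text, "->", kind)
# 192.168.1.10 -> private (RFC 1918)
# 8.8.8.8 -> global (RIR-allocated)
# 10.0.0.5 -> private (RFC 1918)
```

A campus network typically numbers its internal machines from the private ranges and relies on its provider's registry-allocated block only at the border, which is why these two categories matter in practice.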
Internet Operations
Internet operations are coordinated worldwide through the Internet Engineering Planning Group (IEPG), an Internet operational group intended to assist Internet Service Providers to interoperate within the global Internet. At global regional levels, bodies active in coordinating operations include:
- American Registry for Internet Numbers (ARIN): manages the Internet numbering resources for North America, a portion of the Caribbean, and sub-equatorial Africa.
- Asia Pacific Networking Group (APOPs): promotes the Internet and the coordination of network inter-connectivity in the Asia Pacific region.
Internet Security
Internet network security is significantly facilitated by a number of Computer Emergency Response Teams (CERTs) in eight countries and within a number of service provider operations and private networks. They were formed to continually monitor the network for security incidents, serve as a repository for information about such incidents, and develop responsive advisories. The CERTs are coordinated by the Forum of Incident Response and Security Teams.
Internationalisation
Alis Technologies: Founded in 1981, Alis Technologies Inc. develops standards for Multilingual Information Management Solutions (MIMS) with the IETF (Internet Engineering Task Force), the Unicode Consortium, W3C and LISA.
Internationalized Domain Names (IDN) Committee: Working group studying how to make domain names available in character sets other than ASCII.
MINC (Multilingual Internet Names Consortium): A non-profit, non-governmental, international organization. It focuses on promoting the multilingualisation of Internet names, including Internet domain names and keywords, the internationalization of Internet name standards and protocols, technical coordination, and liaison with other international bodies.
CNNIC (China Network Information Center) (site is in Chinese)
KRNIC (Korea Network Information Center): Established the system for managing Internet address resources in Korea.
TWNIC (Taiwan Network Information Center): The neutral, non-profit organization in charge of domain name registration and IP address allocation in Taiwan.
The Unicode Consortium: Responsible for defining the behavior and relationships between Unicode characters, and for providing technical information to implementers. The Consortium cooperates with ISO in refining the specification and expanding the character set.
Internet Connectivity
Network Startup Resource Center (NSRC): Database about international networking developments and Internet connectivity providers, with major emphasis on countries in Asia, Africa, Latin America and the Caribbean, the Middle East, and Oceania. Information is available on a country-by-country basis and includes connectivity providers, networking infrastructure, and other country-specific information.
Connectivity Table: From the University of Wisconsin's FTP server. Lists entities with and without international network connectivity. Shows countries (with ISO two-letter country code, ISO 3166) which have: international IP Internet links; domestic UUCP sites connected to the Global Multiprotocol Open Internet; and domestic FIDONET sites connected to the Global Multiprotocol Open Internet.
Connectivity Maps: The Internet Hosts Map shows the millions of Internet hosts worldwide as of January 1999. http://www.isoc.org/internet/infrastructure/
As the Internet grows in popularity (and usefulness?), choosing the right connection for accessing it is becoming a very important decision at many nonprofits. While modems still provide many smaller nonprofits with their crawling connection, DSL, cable, ISDN, and T1 are introducing many organizations to the joys of high-speed access. Speed
When you read an ad that says: "Our Connection Provides You With an Amazing and Unbelievable 10 Megs Per Second Download Speed," you should be amazed, but maybe not as much as you think. Some time back, a quick marketing person decided that Internet connection speed would be better described in bits, not the bytes usually used to describe disk size and RAM space. So that "unbelievable" 10 MegaBITS per second is actually about 1.25 megaBYTES per second. This article, not to confuse things further, will continue to discuss Internet speed in bits per second. So, don't jump out of your skin the next time you read about another "unbelievable" download speed. Simply divide by 8. Types of Access: Dial-Up vs. "Always On"
Anyone who has used a modem knows the problems associated with dial-up access. Even if you just want to check your email for one minute, you have to wait a couple of minutes for your modem to dial a number and establish a connection to your ISP. It often takes less time to check your email than it does to connect to the Internet! While this isn't a major problem if you rarely use the Internet, it can be a major annoyance if you use it heavily. For heavy users, a dedicated, "always on" connection such as DSL or T1 is the better alternative. Not only does such a connection provide access "on demand," it is also faster and easier to share with a large group of users. Dial-Up Connection - 56K
Bottom line -- modem speeds have pretty much hit the limit with 56K modems. In fact, 56K is a little misleading: due to FCC regulations, the maximum transmission speed is actually around 53K. If your organization needs a faster connection, you will have to go with a digital connection (i.e., xDSL, ISDN, etc.). However, if you find the right ISP and your nonprofit doesn't have high access demands, modems may more than fulfill your needs. http://www.techsoup.org/learningcenter/connections/archives/page10224.cfm
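The bits-versus-bytes arithmetic discussed above can be sketched in a few lines of Python (a toy helper of my own, not from the quoted article):

```python
def advertised_mbps_to_megabytes(mbps):
    """Convert an advertised speed in megabits per second
    to megabytes per second: 1 byte = 8 bits, so divide by 8."""
    return mbps / 8.0

# The "unbelievable" 10 megabit-per-second connection from the ad:
print(advertised_mbps_to_megabytes(10))  # 1.25 (megabytes per second)
```

So a connection advertised at 56 kilobits per second actually moves at most about 7 kilobytes per second, which is why even a 1 MB download over dial-up takes minutes rather than seconds.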
As said above, it is better to use DSL than dial-up, and it is a good thing that the university's Internet connection is DSL.
Technology
Internet Rocket Homepageware 5.0: Internet Rocket is a simple tool for speeding up an Internet connection. Use the easy-to-follow wizard to optimize the registry for your Internet connection. DNS Rocket caches server information on the hard drive to speed up the Internet even more.
InternetVelocity 1.5: InternetVelocity is a powerful utility that dramatically increases the speed at which you surf the web by configuring key TCP/IP parameters, filtering pop-up and banner ads, and prefetching pages before you need them! Surf two or three times faster!
Accelerweb 2.0: Accelerweb is the latest in simple Internet acceleration technology. It caches DNS settings by storing IP address/domain information in a local cache, the "Hosts" file.
http://fast-internet.softplatz.net/
Infrastructure
In information technology and on the Internet, infrastructure is the physical hardware used to interconnect computers and users. Infrastructure includes the transmission media, including telephone lines, cable television lines, and satellites and antennas, and also the routers, aggregators, repeaters, and other devices that control transmission paths. Infrastructure also includes the software used to send, receive, and manage the signals that are transmitted.
In some usages, infrastructure refers to interconnecting hardware and software and not to computers and other devices that are interconnected. However, to some information technology users, infrastructure is viewed as everything that supports the flow and processing of information.
Infrastructure companies play a significant part in evolving the Internet, both in terms of where the interconnections are placed and made accessible and in terms of how much information can be carried how quickly. http://searchdatacenter.techtarget.com/sDefinition/0,,sid80_gci212346,00.html
From what I know, the university is using fiber-optic cable, which is a glass or plastic fiber that carries light along its length. Fiber optics is the overlap of applied science and engineering concerned with the design and application of optical fibers. Optical fibers are widely used in fiber-optic communications, which permit transmission over longer distances and at higher bandwidths (data rates) than other forms of communications. Fibers are used instead of metal wires because signals travel along them with less loss, and they are also immune to electromagnetic interference. Fibers are also used for illumination, and are wrapped in bundles so they can be used to carry images, thus allowing viewing in tight spaces. Specially designed fibers are used for a variety of other applications, including sensors and fiber lasers.
Light is kept in the core of the optical fiber by total internal reflection. This causes the fiber to act as a waveguide. Fibers which support many propagation paths or transverse modes are called multi-mode fibers (MMF), while those which can only support a single mode are called single-mode fibers (SMF). Multi-mode fibers generally have a larger core diameter, and are used for short-distance communication links and for applications where high power must be transmitted. Single-mode fibers are used for most communication links longer than 550 meters (1,800 ft).
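To make the total-internal-reflection idea above concrete, here is a small Python sketch computing the critical angle at the core/cladding boundary. The refractive indices are illustrative values I have assumed for a typical silica fiber; they are not figures from the text:

```python
import math

# Assumed (illustrative) refractive indices for a silica fiber:
n_core = 1.475      # core glass
n_cladding = 1.460  # cladding glass, slightly lower

# Light hitting the core/cladding boundary at an angle from the
# normal greater than the critical angle is totally reflected,
# which is what keeps the signal inside the fiber:
critical_angle_deg = math.degrees(math.asin(n_cladding / n_core))
print(f"critical angle is about {critical_angle_deg:.1f} degrees")
```

With these assumed values the angle comes out near 82 degrees, meaning light must travel nearly parallel to the fiber axis to stay trapped; this is also why fiber must not be bent too sharply.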
Joining lengths of optical fiber is more complex than joining electrical wire or cable. The ends of the fibers must be carefully cleaved, and then spliced together either mechanically or by fusing them together with an electric arc. Special connectors are used to make removable connections.
Fiber optic cables transfer data as light, and they are more reliable than UTP (Unshielded Twisted Pair) or STP (Shielded Twisted Pair) cable, since they are better protected from interference and noise.
The drawbacks are that fiber optic cable is more expensive, and it is delicate to cut and terminate, so it requires specialized tools.
All in all, I am contented with the Internet connection of the university, except for the fact that the connection sometimes times out. | |
| | | kristine_delatorre
Posts : 58 Points : 60 Join date : 2009-06-21 Age : 33 Location : davao city
| Subject: Re: Assignment 6 (Due: before August 19, 2009, 13:00hrs) Thu Oct 01, 2009 3:09 am | |
| " An IT consultant works in partnership with clients, advising them how to use information technology in order to meet their business objectives or overcome problems. Consultants work to improve the structure and efficiency of an organisation's IT systems. IT consultants may be involved in a variety of activities, including marketing, project management, client relationship management and systems development. They may also be responsible for user training and feedback. In many companies, these tasks will be carried out by an IT project team. IT consultants are increasingly involved in sales and business development, as well as technical duties."(1) In IT consulting, professionals are expected to assist users in developing or using applications and software packages and their features; install, configure, and modify applications, networks, databases, and other systems; provide academic course management and related services to faculty; and act as liaison and interface between faculty, staff, and information systems resources and staff.
An IT consultant's typical activities involve several tasks:
- meeting with the University Programmer to determine requirements;
- working with the university's concerned personnel to define the scope of a project;
- planning timescales and the resources needed;
- clarifying a client's system specifications, understanding their work practices and the nature of their business;
- travelling to customer sites;
- liaising with staff at all levels of a client organisation;
- defining software, hardware and network requirements;
- analysing IT requirements within companies and giving independent and objective advice on the use of IT;
- developing agreed solutions and implementing new systems;
- presenting solutions in written or oral reports;
- helping clients with change-management activities;
- project managing the design and implementation of preferred solutions;
- purchasing systems where appropriate;
- designing, testing, installing and monitoring new systems;
- preparing documentation and presenting progress reports to customers;
- organising training for users and other consultants;
- being involved in sales and support and, where appropriate, maintaining contact with client organisations;
- identifying potential clients and building and maintaining contacts.
If I were hired as the IT consultant of the university, I would suggest the latest trends that are useful for our Internet connection. Technologies, steps and simple tips are essential in making the Internet connection more functional than before. Here are some technologies that would suit our university's Internet connection:
1. WiFi (Wireless Fidelity). The "WiFi" label is administered by an independent organisation (the Wi-Fi Alliance) that tests the interoperability of 802.11 products. When a product has been demonstrated to interwork with other manufacturers' compliant products, it may carry the "WiFi" label.
Suggested Technology:
Dual Band 802.11 capability is embedded in wireless LAN access points, routers and client adapters to provide seamless roaming between 802.11a and 802.11b networks. Dual band devices will also interoperate with 802.11g.
Linksys WGA600N-RM Dual Band Wireless-N Gaming Adapter Review: Find some price comparisons for the Linksys Dual Band Wireless-N Gaming Adapter (new and refurbished) here. Because we have also published a similar gaming adapter (the Linksys WRT110 WiFi Wireless Router with RangePlus-N), we suggest you head to that post and compare it with the Linksys WGA600N-RM. The price of one Linksys WGA600N-RM Dual Band Wireless-N Gaming Adapter on Woot was $32.99 (refurbished or reconditioned) + $5 shipping. Though the price on that site is cheap, the scenario is somewhat different elsewhere. On Buy.com, you can have one of these networking gadgets for under $40; still a big saving if you buy it on Woot. On Amazon, for example, the price for a new Linksys WGA600N-RM starts at $70.99; the reconditioned ones are still expensive with a tag price of $65, and these amounts do not include shipping yet. According to its manufacturer, the Linksys Dual Band Wireless-N Gaming Adapter model WGA600N-RM has dual-band capability and can connect through 2.4 GHz or 5 GHz, whichever is applicable in the user's case. Consumer reviews are generally good despite some problems with its configuration, especially the assignment of an IP address.
This is because the Linksys WGA600N-RM Dual Band Wireless-N Gaming Adapter "connects to your Ethernet-enabled Xbox, PlayStation, GameCube or other Ethernet-enabled gaming consoles – no drivers required." Some enticing features of the Linksys WGA600N-RM Dual Band Wireless-N Gaming Adapter include the following:
- Compatible with draft 802.11n, 802.11a, 802.11b and 802.11g compliant devices
- Operates in the 2.4 GHz and 5 GHz frequency ranges for maximum flexibility
- Dynamically shifts channels and wireless networks based on signal strength and link quality for maximum availability and reliability of connection
- Secure wireless LAN party gaming mode: one gaming adapter acts as a wireless AP, and any further gaming adapter connects automatically using WPS, with no computer required to set up the adapters
- Utilizes up to 256-bit Wi-Fi Protected Access (WPA/WPA2) to ensure security
- Easy setup with WPS (Wi-Fi Protected Setup)
- Works with all network-ready game consoles, including Xbox, Xbox 360, PlayStation 2* or 3, and GameCube*
802.11g wireless broadband routers support higher speeds and more options than older 802.11b routers. If upgrading from 802.11b, you should notice faster file sharing and printing, and you can add more computers to an 802.11g network without bogging it down. 802.11g routers are not as fast or full-featured as newer 802.11n wireless routers. However, when you consider their solid performance, overall capabilities, and more affordable pricing, an 802.11g router may still be right for you. 802.11n is the third-generation Wi-Fi standard for wireless home networking. 802.11n equipment is backward compatible with older 802.11g or 802.11b gear, and it supports much faster wireless connections over longer distances. So-called Wireless N or Draft N routers available today are based on a preliminary version of the 802.11n industry specification. They are not guaranteed to fully interoperate with future 802.11n products.
All products listed in this category feature the three MIMO radios and antennas that are the key feature of Wireless N routers, plus four-port 10/100 Ethernet switches for wired connections. Wi-Fi is now essential to students' learning; with their laptops they can surf hassle-free.
2. LAN Topology Innovation
Today, technology plays a central role in sparking the imagination, facilitating learning and creating new possibilities in education environments. In particular, networking technology can deliver to primary and secondary schools a wide range of vital broadband capabilities. The foundational requirements needed to achieve leading-edge functionality for education include the advantages of high-speed local-area networks (LANs), the migration from hubs to switches in the wired environment, and the enhanced flexibility, mobility, portability, and scalability enabled by a combined wired and wireless infrastructure. Here are some suggested Cisco guidelines on this matter:
- Learning: e-learning has become a vital tool for education. Through e-learning, schools are providing students with tremendous flexibility, while extending their programs outside the classroom.
- Online Content: In the school environment, online content is fast becoming the rule. This content, which may include curricula, tutorials, reference materials, and records, must be accessible to students and/or staff quickly and efficiently.
- Multimedia Capabilities: Flat data files are quickly being replaced by rich multimedia content. Therefore, school system networks must be robust enough to support rich multimedia applications, as well as streaming media.
- Converged Voice, Video and Data Applications: Converged voice, video and data applications, like IP telephony, enable far greater interaction among students, educators and parents. Some of the wide-ranging capabilities of such applications might be used to facilitate better parent communication and involvement or help ensure student safety.
- Mobile Computer Labs: Increasingly, students need to access broadband applications outside the computer room. Therefore, mobile computer labs are fast becoming popular in the school setting. With mobile computer labs, PCs and laptops are carted into individual classrooms, where students have full access to broadband capabilities through wireless connections.
3. Integrating Wireless LAN
For many school environments, wireless technology is an important addition to the network. In a high-performance switching environment, wireless technology can deliver Ethernet-level speeds reaching 11 Mbps to open areas on the campus, like informational kiosks in the quad or cafeteria. Typically, a wireless network cannot replace the wired LAN. However, it can dramatically improve the usability and scalability of the existing network. Many successful school implementations have shown that wireless technology delivers substantial administrative, learning, and cost-savings benefits.
Advantages: Portable Computing: More Users with Fewer Connections
Wireless technology allows users to achieve total PC portability and location independence. Wireless allows schools to put computer resources wherever they are needed, without hardwired connections for every computer. With a WLAN, a single hardwired drop linked to a wireless access point in any classroom provides a network access point for multiple PCs equipped with WLAN adapters. This type of configuration eliminates the location constraints of hardwired structures, and maximizes utilization of PC resources. As a result, laptops can be taken along and used in any location. In fact, schools can even set up mobile computer labs, in which laptops or PCs are carted into individual classrooms on an as-needed basis. Wireless broadband technology can then be brought to every classroom and every student, greatly enriching the learning experience for all subjects.
In the ever-changing school environment, wireless technology can also reduce the cost and complexity of facility reconfigurations. Generally, wireless technology assumes three principal roles. First, schools add wireless to the LAN to give users greater mobility and flexibility in schools and school campuses. Secondly, wireless provides LAN access in buildings that are difficult to rewire for high-speed access. And, lastly, wireless bridges deliver LAN connectivity to remote sites and users. Each type of access can yield substantial benefits for students, faculty, and staff.
4. High-Speed LANs
The wired LAN is the principal means of connection for the high-speed school LAN, even in an integrated wired and wireless environment. Compared to wireless technology, the wired LAN offers significantly higher transfer rates over both short and long distances. However, steps must be taken to achieve maximum performance for your wired LAN. For instance, the network infrastructure must be optimized with a configuration that delivers sufficient bandwidth and intelligence to meet increasing traffic demands. In addition, your wired LAN should feature:
* Robust Quality of Service (QoS): QoS enhances bandwidth management so that high-priority traffic receives preference on the network.
* Continual Network Availability: Maintaining high availability is vital for any mission-critical school LAN.
* Security: In a common infrastructure, data and application access must be restricted and protected.
5. Steps
1. Do some basic maintenance on your PC. Run Disk Defrag, a scan disk, a virus scan, a malware scan, and clear your recycle bin. An unusually slow Internet connection is often the only sign that your computer is infected with viruses or other malware. Delete old files and temporary files. Never allow the free space on your C: drive to be less than 10% of the total size or twice the installed RAM (whichever is larger).
A well-maintained PC will operate much better than a PC that has never had any maintenance. Google or your local computer repair store should be able to help you with this if you don't know how.
2. Reset your home network. Sometimes restarting your home network, if you have one, will drastically increase the speed of your connection.
3. Optimize your cache or temporary Internet files. These files improve your Internet connection performance by not downloading the same file over and over. When a web site puts its logo graphic on every page, your computer only downloads it when it changes. If you delete the temporary files, the logo must be downloaded again; if you disable the cache, it must be downloaded every time you view a page that uses it. To tune this, open Internet Explorer, click "Tools" at the top and choose "Internet Options". On the General tab, click the "Settings" button next to Temporary Internet Files. Set "Check for newer versions" to "Automatically". Set the amount of disk space to use to 2% of your total disk size or 512 MB, whichever is smaller. In Firefox, click "Tools", then "Options", go to the Privacy tab, and then click on the Cache tab within it.
4. Never bypass your router. Most routers include a firewall that is very difficult for hackers to defeat. If you don't need to use wireless, then hook your computer directly to your router. Routers will only slow down your connection by a few milliseconds. You won't notice the difference, but the hackers will.
5. If you are using a wireless router, make sure it doesn't conflict with a cordless phone or wireless camera. Wireless routers come in two varieties: 802.11b/g (2.4 GHz) and 802.11a (5 GHz). If you are using a 2.4 GHz cordless phone and a 2.4 GHz wireless router, then your Internet connection speed will slow while you use the cordless phone. The same is true of wireless security cameras. Check your phone and camera: if it's 900 MHz, then it's fine.
If it says 2.4 GHz or 5.8 GHz, then it could be the cause of your slow connection speed while it is in use.
6. Call your Internet service provider (ISP). Sometimes you just have bad service. They can usually tell if your connection is substandard without having a technician come to your home. Just be nice and ask.
7. Upgrade your computer. If your computer is slow, it doesn't matter how fast your Internet connection is; the whole thing will just seem slow. You can only access the Internet as fast as your PC will allow you to.
8. Replace your old cable modem. Any solid-state electronics will degrade over time due to accumulated heat damage. Your broadband modem will have a harder and harder time 'concentrating' on maintaining a good connection as it gets older (signal-to-noise ratios will go down, and the number of resend requests for the same packet will go up). An after-market cable modem, as opposed to a cable-company modem, will frequently offer a better connection.
9. Often your connection speed is slow because other programs are using it. To test whether other programs are accessing the Internet without your knowledge, click Start, click Run, and type "cmd" (without quotes). Then type "netstat -b 5 > activity.txt". After a minute or so, hold down Ctrl and press C. This creates a file with a list of all programs using your Internet connection. Type "activity.txt" to open the file and view the program list. Then press Ctrl-Alt-Delete, open the Task Manager, go to the Processes tab, and end those processes that are stealing your valuable bandwidth. (NOTE: Ending processes may cause certain programs to not function properly.)
10. After you have tried all this, try your connection again and see if it's running any faster.
6. Tips
- Call your ISP and have them verify all of your TCP/IP settings if you are concerned. Ask them to verify that your proxy settings are correct.
- Don't expect dial-up or high-speed "lite" service to be fast.
The Internet is primarily geared towards broadband connections; sometimes, you have to wait a little.
- Download programs that make browsing faster:
o Loband.org is a browser inside of a browser that loads web pages without the images.
o Firefox and Opera both have options to disable images.
o In Firefox, you can also use extensions such as NoScript that let you block scripts and plug-ins that would otherwise slow things down a lot.
o If you are using Internet Explorer or Firefox, try downloading Google Web Accelerator. It is meant to speed up broadband connections, but it can also slow your Internet connection. Try enabling and disabling it and see when your Internet connection runs faster.
o If you are using Firefox, download the Fasterfox extension and FireTune.
o Reduce the number of running programs that use your Internet connection (instant messengers, RSS feeders, and MS applications set to send Internet data).
o Google Accessible is designed to search pages in order of how clean they are of junk. This will bring up pages that are usually not only easy to read, but quick to load.
- Upgrade your RAM. This will not only improve your regular computer use, but will also affect the speed of your Internet connection, because your computer works faster.
- Use the Stop button to stop loading pages once you've gotten what you want.
- Sometimes malware on your computer can eat up your bandwidth. Make sure you have an up-to-date malware protection program.
- Most Internet providers have flaky DNS servers (no citation necessary, it's a given); so, instead of using those provided by your ISP, switch your DNS servers to those of OpenDNS. OpenDNS is far faster and more reliable; simply using 208.67.222.222 and 208.67.220.220 as your domain name servers will solve most flaky DNS problems (and may even speed up your networking, since OpenDNS has large caches).
- Look into running your own local DNS server on your network.
Some newer routers may include their own nameserver; otherwise, check into AnalogX.com's DNSCache program. It works well to hold commonly accessed domain names in a cache so that the IP addresses do not have to be looked up every time you navigate to a new page.
7. Warnings
- Viruses and malware can often use up your bandwidth and slow down your Internet connection. Make sure you have protection against this. Many ISPs will provide software for it. Make sure your anti-virus and malware scanners are up to date.
- Bypassing the router will leave you more vulnerable to attacks, because you no longer have the built-in firewall of your router protecting you.
- Watch out for scams that claim to make your Internet go a lot faster for free. They may tell you to download their program, which usually has a lot of other hidden programs attached that might steal your identity.
Being an IT consultant is quite a difficult task.
References:
http://www.wikihow.com/Maximize-the-Speed-of-Your-Internet-Connection
http://www1.ous.edu/owpd/plsql/owpd_pos_desc?p_pos_id=111
http://www.kokeygadgets.com/featured/networking/linksys-wga600n-rm-review-linksys-dual-band-wireless-n-gaming-adapter/
http://compnetworking.about.com/od/wirelessrouters80211g/tp/80211ghome.htm
http://www.cisco.com/en/US/products/hw/switches/ps646/products_quick_reference_guide09186a00800a8484.html | |
| | | janraysuriba
Posts : 34 Points : 35 Join date : 2009-06-20 Age : 36 Location : Davao City
| Subject: If you were hired by the university president as an IT consultant, what would you suggest (technology, infrastructure, innovations, steps, processes, etc) in order for the internet connectivity be improved? (3000words) Thu Oct 01, 2009 2:27 pm | |
| If the University President hired me as an IT consultant, I would suggest innovations and technologies to improve the internet connectivity of the University.
The term innovation refers to a new way of doing something. It may refer to incremental and emergent or radical and revolutionary changes in thinking, products, processes, or organizations. Following Schumpeter (1934), contributors to the scholarly literature on innovation typically distinguish between invention, an idea made manifest, and innovation, ideas applied successfully in practice. In many fields, something new must be substantially different to be innovative, not an insignificant change, e.g., in the arts, economics, business and government policy. In economics the change must increase value, customer value, or producer value. The goal of innovation is positive change, to make someone or something better. Innovation leading to increased productivity is the fundamental source of increasing wealth in an economy.
Innovation is an important topic in the study of economics, business, design, technology, sociology, and engineering. Colloquially, the word "innovation" is often synonymous with the output of the process. However, economists tend to focus on the process itself, from the origination of an idea to its transformation into something useful, to its implementation; and on the system within which the process of innovation unfolds. Since innovation is also considered a major driver of the economy, especially when it leads to increasing productivity, the factors that lead to innovation are also considered to be critical to policy makers. In particular, followers of innovation economics stress using public policy to spur innovation and growth.
Those who are directly responsible for application of the innovation are often called pioneers in their field, whether they are individuals or organisations.
Goals of innovation
Programs of organizational innovation are typically tightly linked to organizational goals and objectives, to the business plan, and to market competitive positioning. For example, one driver for innovation programs in corporations is to achieve growth objectives. As Davila et al. (2006) note,
"Companies cannot grow through cost reduction and reengineering alone . . . Innovation is the key element in providing aggressive top-line growth, and for increasing bottom-line results" (p.6)
In general, business organisations spend a significant amount of their turnover on innovation, i.e., making changes to their established products, processes and services. The amount of investment can vary from as low as half a percent of turnover for organisations with a low rate of change to over twenty percent of turnover for organisations with a high rate of change. The average investment across all types of organizations is four percent. For an organisation with a turnover of, say, one billion currency units, this represents an investment of forty million units. This budget will typically be spread across various functions including marketing, product design, information systems, manufacturing systems and quality assurance.
The investment may vary by industry and by market positioning.
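The budget arithmetic quoted above can be checked with a short sketch; the turnover figure and the three investment rates mirror the ones in the text, and the firm itself is hypothetical:

```python
# Worked example of the innovation-investment figures quoted above.
# innovation spend = turnover x investment rate (a simple product).
def innovation_budget(turnover, rate):
    """Return the innovation investment for a given turnover and rate."""
    return turnover * rate

TURNOVER = 1_000_000_000  # one billion currency units, as in the text

low = innovation_budget(TURNOVER, 0.005)  # low rate of change: 0.5%
avg = innovation_budget(TURNOVER, 0.04)   # average across organisations: 4%
high = innovation_budget(TURNOVER, 0.20)  # high rate of change: 20%

# The average case reproduces the forty million units cited above.
print(low, avg, high)  # roughly 5, 40 and 200 million units
```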
One survey[citation needed] across a large number of manufacturing and services organisations found, ranked in decreasing order of popularity, that systematic programs of organizational innovation are most frequently driven by:
1. Improved quality
2. Creation of new markets
3. Extension of the product range
4. Reduced labour costs
5. Improved production processes
6. Reduced materials
7. Reduced environmental damage
8. Replacement of products/services
9. Reduced energy consumption
10. Conformance to regulations

These goals vary between improvements to products, processes and services, and dispel a popular myth that innovation deals mainly with new product development. Most of the goals could apply to any organisation, be it a manufacturing facility, marketing firm, hospital or local government.
Failure of innovation
Research findings vary, ranging from fifty to ninety percent of innovation projects judged to have made little or no contribution to organizational goals. One survey regarding product innovation quotes that out of three thousand ideas for new products, only one becomes a success in the marketplace.[citation needed] Failure is an inevitable part of the innovation process, and most successful organisations factor in an appropriate level of risk. Perhaps it is because all organisations experience failure that many choose not to monitor the level of failure very closely. The impact of failure goes beyond the simple loss of investment. Failure can also lead to loss of morale among employees, an increase in cynicism and even higher resistance to change in the future.
Innovations that fail are often potentially good ideas but have been rejected or postponed due to budgetary constraints, lack of skills or poor fit with current goals. Failures should be identified and screened out as early in the process as possible. Early screening avoids unsuitable ideas devouring scarce resources that are needed to progress more beneficial ones. Organizations can learn how to avoid failure when it is openly discussed and debated. The lessons learned from failure often reside longer in the organisational consciousness than lessons learned from success. While learning is important, high failure rates throughout the innovation process are wasteful and a threat to the organisation's future.
The causes of failure have been widely researched and can vary considerably. Some causes will be external to the organisation and outside its influence of control. Others will be internal and ultimately within the control of the organisation. Internal causes of failure can be divided into causes associated with the cultural infrastructure and causes associated with the innovation process itself. Failure in the cultural infrastructure varies between organizations but the following are common across all organisations at some stage in their life cycle (O'Sullivan, 2002):
1. Poor Leadership
2. Poor Organization
3. Poor Communication
4. Poor Empowerment
5. Poor Knowledge Management

Common causes of failure within the innovation process in most organisations can be distilled into five types:

1. Poor goal definition
2. Poor alignment of actions to goals
3. Poor participation in teams
4. Poor monitoring of results
5. Poor communication and access to information

Effective goal definition requires that organisations state explicitly what their goals are in terms understandable to everyone involved in the innovation process. This often involves stating goals in a number of ways. Effective alignment of actions to goals should link explicit actions such as ideas and projects to specific goals. It also implies effective management of action portfolios. Participation in teams refers to the behaviour of individuals in and of teams, and each individual should have an explicitly allocated responsibility regarding their role in goals and actions and the payment and rewards systems that link them to goal attainment. Finally, effective monitoring of results requires the monitoring of all goals, actions and teams involved in the innovation process.
Innovation can fail if seen as an organisational process whose success stems from a mechanistic approach i.e. 'pull lever obtain result'. While 'driving' change has an emphasis on control, enforcement and structure it is only a partial truth in achieving innovation. Organisational gatekeepers frame the organisational environment that "Enables" innovation; however innovation is "Enacted" – recognised, developed, applied and adopted – through individuals.
Individuals are the 'atom' of the organisation, close to the minutiae of daily activities. Within individuals, a gritty appreciation of small details combines with a sense of desired organisational objectives to deliver (and innovate for) a product/service offer.
From this perspective innovation succeeds from strategic structures that engage the individual to the organisation's benefit. Innovation pivots on intrinsically motivated individuals, within a supportive culture, informed by a broad sense of the future.
Innovation implies change, and can run counter to an organisation's orthodoxy. Space for a fair hearing of innovative ideas is required to balance the potential autoimmune exclusion that quells an infant innovative culture.
Measures of innovation

There are two fundamentally different types of measures for innovation: the organizational level and the political level.

The measure of innovation at the organizational level relates to individuals, team-level assessments, and private companies from the smallest to the largest. Measurement of innovation for organizations can be conducted by surveys, workshops, consultants or internal benchmarking. There is today no established general way to measure organizational innovation. Corporate measurements are generally structured around balanced scorecards which cover several aspects of innovation, such as business measures related to finances, innovation process efficiency, employees' contribution and motivation, as well as benefits for customers. Measured values will vary widely between businesses, covering for example new product revenue, spending on R&D, time to market, customer and employee perception and satisfaction, number of patents, and additional sales resulting from past innovations.

At the political level, measures of innovation focus more on a country's or region's competitive advantage through innovation. In this context, organizational capabilities can be evaluated through various evaluation frameworks, such as those of the European Foundation for Quality Management. The OECD Oslo Manual (1995) suggests standard guidelines on measuring technological product and process innovation. Some people consider the Oslo Manual complementary to the Frascati Manual from 1963. The new Oslo Manual from 2005 takes a wider perspective on innovation and includes marketing and organizational innovation. These standards are used, for example, in the European Community Innovation Surveys.
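The balanced-scorecard idea mentioned above can be sketched as a weighted average of normalised metric scores. The metric names, scores and weights below are illustrative assumptions for a hypothetical organisation, not a standard instrument:

```python
# A minimal balanced-scorecard style innovation measure.
# Each metric is pre-normalised to the range 0..1; weights express the
# relative importance a given organisation assigns to each metric.
def scorecard(metrics, weights):
    """Return the weighted average of the metric scores."""
    total_weight = sum(weights.values())
    return sum(metrics[name] * w for name, w in weights.items()) / total_weight

# Hypothetical scores for one organisation (all names are illustrative).
metrics = {
    "new_product_revenue_share": 0.30,  # share of revenue from recent products
    "time_to_market_score": 0.60,       # 1.0 = best-in-class speed
    "employee_engagement": 0.75,        # survey-based motivation score
    "patents_score": 0.40,              # patent output vs. peers
}
weights = {
    "new_product_revenue_share": 3,
    "time_to_market_score": 2,
    "employee_engagement": 2,
    "patents_score": 1,
}

print(round(scorecard(metrics, weights), 3))  # 0.5
```

As the text notes, there is no established general way to measure organizational innovation, so any such scorecard is specific to the organisation that defines it.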
Other ways of measuring innovation have traditionally been expenditure, for example, investment in R&D (Research and Development) as percentage of GNP (Gross National Product). Whether this is a good measurement of Innovation has been widely discussed and the Oslo Manual has incorporated some of the critique against earlier methods of measuring. This being said, the traditional methods of measuring still inform many policy decisions. The EU Lisbon Strategy has set as a goal that their average expenditure on R&D should be 3 % of GNP.
The Oslo Manual is focused on North America, Europe, and other rich economies. In 2001, the Bogotá Manual was created for Latin America and the Caribbean countries.
Many scholars claim that there is a great bias towards the "science and technology mode" (S&T-mode or STI-mode), while the "learning by doing, using and interacting mode" (DUI-mode) is widely ignored. For example, an organisation may have the best high-tech tools or software, yet the learning tasks that are also crucial for innovation go unmeasured; such measurements and research are rarely done.
A common industry view (unsupported by empirical evidence) is that comparative cost-effectiveness research (CER) is a form of price control which, by reducing returns to industry, limits R&D expenditure, stifles future innovation and compromises new products' access to markets.[8] Some academics claim that CER is a valuable value-based measure of innovation which accords truly significant advances in therapy (those that provide 'health gain') higher prices than free market mechanisms.[9] Such value-based pricing has been viewed as a means of indicating to industry the type of innovation that should be rewarded from the public purse.[10] The Australian academic Thomas Alured Faunce has developed the case that national comparative cost-effectiveness assessment systems should be viewed as measuring 'health innovation' as an evidence-based concept distinct from valuing innovation through the operation of competitive markets (a method which requires strong anti-trust laws to be effective), on the basis that both methods of assessing innovation in pharmaceuticals are mentioned in annex 2C.1 of the AUSFTA.[11][12]
Technology is a broad concept that deals with human as well as other animal species' usage and knowledge of tools and crafts, and how it affects a species' ability to control and adapt to its environment. Technology is a term with origins in the Greek technología (τεχνολογία) — téchnē (τέχνη), 'craft' and -logía (-λογία), the study of something, or the branch of knowledge of a discipline.[1] However, a strict definition is elusive; "technology" can refer to material objects of use to humanity, such as machines, hardware or utensils, but can also encompass broader themes, including systems, methods of organization, and techniques. The term can either be applied generally or to specific areas: examples include "construction technology", "medical technology", or "state-of-the-art technology".
The human species' use of technology began with the conversion of natural resources into simple tools. The prehistorical discovery of the ability to control fire increased the available sources of food and the invention of the wheel helped humans in travelling in and controlling their environment. Recent technological developments, including the printing press, the telephone, and the Internet, have lessened physical barriers to communication and allowed humans to interact freely on a global scale. However, not all technology has been used for peaceful purposes; the development of weapons of ever-increasing destructive power has progressed throughout history, from clubs to nuclear weapons.
Technology has affected society and its surroundings in a number of ways. In many societies, technology has helped develop more advanced economies (including today's global economy) and has allowed the rise of a leisure class. Many technological processes produce unwanted by-products, known as pollution, and deplete natural resources, to the detriment of the Earth and its environment. Various implementations of technology influence the values of a society and new technology often raises new ethical questions. Examples include the rise of the notion of efficiency in terms of human productivity, a term originally applied only to machines, and the challenge of traditional norms.
Philosophical debates have arisen over the present and future use of technology in society, with disagreements over whether technology improves the human condition or worsens it. Neo-Luddism, anarcho-primitivism, and similar movements criticise the pervasiveness of technology in the modern world, opining that it harms the environment and alienates people; proponents of ideologies such as transhumanism and techno-progressivism view continued technological progress as beneficial to society and the human condition. Indeed, until recently, it was believed that the development of technology was restricted only to human beings, but recent scientific studies indicate that other primates and certain dolphin communities have developed simple tools and learned to pass their knowledge to other generations.
There are many innovative ways to improve internet connectivity. We must find new approaches to improve the connection, such as installing software that speeds up the internet connection. The university also needs new technology to meet the goal of improving its internet connection. Buying new networking hardware and setting up the network properly will surely enhance the internet connectivity of the University. http://en.wikipedia.org/wiki/Innovation http://en.wikipedia.org/wiki/Techonology | |
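Before buying new networking hardware, the university could baseline its current connectivity and re-measure after each change, so any improvement is verifiable. A rough sketch using TCP connect time as a latency proxy; the target host, port and sampling scheme are illustrative assumptions, not a standard benchmark:

```python
# Rough latency baseline: measure TCP connect time to a host, several
# times, and keep the median to smooth out jitter. Re-run after each
# network change (new hardware, new set-up) to compare.
import socket
import time

def connect_latency_ms(host, port=80, timeout=3.0):
    """Return the TCP connect time in milliseconds, or None on failure."""
    start = time.perf_counter()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            pass
    except OSError:
        return None
    return (time.perf_counter() - start) * 1000.0

def median_latency_ms(host, port=80, samples=5):
    """Median of several connect-time samples; None if all fail."""
    results = sorted(
        r for r in (connect_latency_ms(host, port) for _ in range(samples))
        if r is not None
    )
    return results[len(results) // 2] if results else None
```

A usage example might be `median_latency_ms("www.example.com")` run once before and once after an upgrade; the median is used rather than the mean so a single congested sample does not distort the comparison.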