Network Enhancers - "Delivering Beyond Boundaries"

Saturday, August 30, 2014

Know the Art of Wealth Creation

What is wealth?

Wealth is a state of mind. It is not strictly any number.

It is actually a feeling of “enough” - without fear or worry.

If your focus is on money as a measure of your wealth, a feeling of having enough money is an indication of being wealthy. The truth is that money is only a part of your wealth.

If money were all one had, one would starve. Aristotle (384 – 322 B.C.) contended that the components of wealth were coined money, property, livestock, implements, and slaves.

Robert Kiyosaki, of Rich Dad Poor Dad fame, sees a difference between the rich and the wealthy. According to him, "The rich might have lots of expenses that keep them up at night. Or they might have a high-paying job but have to get up for work every day and live in fear of getting fired or laid off." He adds that the wealthy, unlike the rich, don't have these worries.

“The remedy for weakness is not brooding over weakness, but thinking of strength,” said Swami Vivekananda. Likewise, if you want to become wealthy, your focus should be on “wealth” and not on the “lack of it.”

How?

If your heart is filled with abundance and joy, you hold the key to real wealth. If you can imagine something, material wealth included, you can manifest it. Your dream simply comes true. You can have everything in your life that you can possibly imagine.

The foundation of wealth building is peace of mind. While dreaming about a wealthy future, don’t forget to live in this moment. Learn to enjoy the whole journey.

Is this doable?

Yes. You can do it. Money will no longer be just a dream; you will see it in abundance. All it takes is three easy steps.

STEP 1 is a step towards peace of mind. Protect yourself and your family from all known and unknown adverse financial circumstances, within your means. Being aware of these risks and taking steps actually gives you immense peace of mind. What are they?


  • Money to cover basic needs
  • Your own roof (home) over your head
  • Risk protection mechanism in place to deal with unknown life events


STEP 2 is passive income. Stop working for money; let the money work for you! It has worked that way for many imaginative people. That’s why Albert Einstein said, “Imagination is more important than knowledge.”

Where can I find my passive income? If this is your question, here’s the answer.

You can find passive income in:

  • Real Estate
  • Business
  • Investments
  • Internet

How?

I can show you, when we get connected. But don’t forget to ask me about this.

STEP 3 is perpetual income. It means a continuous flow of income without any interruption.

Will it happen?

Yes, it will. First you have to achieve success in steps 1 and 2. Then you have to understand how you achieved it. Finally, systematize your success and repeat. Systematizing steps 1 and 2 is actually step 3.

Friday, August 29, 2014

IOx: Cisco Creates A Networking OS To Help Manage IoT


Cisco has developed a hybrid router operating system to help control and manage Internet of Things (IoT) devices from Cisco edge routers and other networked devices. Cisco announced yesterday that its IOx, an OS that combines the open source Linux OS and the Cisco IOS network OS, will allow customers to create and run applications directly on Cisco industrial networked devices.
According to Cisco's estimates, which the company considers conservative, there will be 50 billion connected devices by 2020, generating even more data that will be expensive to move, store, analyze, and convert into useful content.
The challenge is managing both the devices and the huge amount of data that comes out of them, and doing so efficiently. Currently most monitoring, analysis, and decision making occurs from a central location. Data from a device is polled, transmitted, received, and verified; and if certain parameters are exceeded, the system doing the monitoring responds by sending instructions back to the device, with additional transmissions occurring for confirmation of receipt of instructions, escalations, and other actions.
Cisco's Fog computing concept works under the idea that some of that monitoring, analysis, and response can occur more efficiently, and at lower cost, at the network's edge, closer to the devices being monitored. Cisco Fog, which the company describes as a cloud that is closer to the ground, is a distributed computing infrastructure for applications. It allows network devices such as hardened routers, switches, and IP video cameras to manage the huge amount of data expected to be generated by people and devices in the Internet of Everything (IoE).
Since most monitoring involves checking that data is within normal limits, a local networked router could be programmed to analyze data collected from IoT devices and act only on data that falls outside of normal parameters. By keeping the processing and the data local, latency would be reduced, resulting in faster response times when action is required. Additionally, data that did not need to be sent for analysis, or that had no value, could be identified and discarded locally, reducing unnecessary traffic over the internet.
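
To make the idea concrete, here is a minimal Python sketch of the kind of threshold filter an edge device could run. This is an illustration of local filtering in the fog computing spirit, not Cisco's actual IOx API; the sensor names and limits are invented.

```python
# Minimal sketch of edge-side filtering: only readings outside normal
# limits are forwarded upstream; everything else is discarded locally.
# Sensor names and limits are hypothetical.

NORMAL_LIMITS = {
    "temperature_c": (10.0, 45.0),
    "vibration_g": (0.0, 2.5),
}

def filter_reading(sensor, value):
    """Return an alert record if the reading needs central attention, else None."""
    low, high = NORMAL_LIMITS[sensor]
    if low <= value <= high:
        return None  # in range: keep it local, no WAN traffic
    return {"sensor": sensor, "value": value, "alert": True}

readings = [("temperature_c", 22.1), ("temperature_c", 71.3), ("vibration_g", 0.4)]
to_forward = []
for sensor, value in readings:
    record = filter_reading(sensor, value)
    if record:
        to_forward.append(record)

print(to_forward)  # only the out-of-range 71.3 C reading is sent upstream
```
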
Some real-world example applications Cisco gives for its Fog computing strategy include:
  • Video cameras that can sense an approaching ambulance's flashing lights could change traffic lights to allow the emergency vehicle to pass through.
  • Energy load balancing that would switch to alternative energy sources based on energy demand and lowest prices.
  • Life saving air quality monitoring in mines that would automatically change airflow if conditions became dangerous to miners.
Recognizing that it would be impossible for Cisco to create and manage every possible application, the company is creating and supporting an open application environment to encourage developers to port existing applications and create new ones across various industries, including the manufacturing, utilities, and transportation sectors.
"Cisco is very excited to accelerate innovation in the Internet of Things by delivering IOx, which provides the ability to combine computation and communication on our ruggedized routers and other devices. We believe that this turns the network into the fourth platform for computing (in addition to PCs, mobile and cloud), which will unleash new applications in manufacturing, transportation, smart cities and many other industries," says Guido Jouret, general manager of Internet of Things Business Group at Cisco.
Cisco IOx is expected to be available in Cisco industrial routers this spring.


Thursday, August 28, 2014

Cisco Introduces the OpFlex Protocol for Next Gen Networks


Cisco introduced a new open standards-based protocol for Application Centric Infrastructure (ACI) called OpFlex, which the company submitted to the Internet Engineering Task Force (IETF) for consideration last week.

Cisco has a stake in the future of software defined networks (SDN), so it is not surprising the company is taking a lead in helping to create protocols and standards that work within SDN. However, OpFlex is not just a Cisco initiative. It has a few well-known backers such as Microsoft, IBM, and SunGard who are, according to Cisco, actively involved in the standardization process.

[Read: Software Defined Networking: Introduction to OpenFlow]

The OpFlex standard is meant to help fill a gap in the current SDN model, which Cisco describes as "an imperative control model with a centralized controller and distributed network entities that support the lowest common denominator feature set across vendors such as bridges, ports and tunnels."

The issue Cisco sees with the existing imperative control model is that as a network grows (scales), the controller becomes a bottleneck because it must handle more processes. The larger the network grows, the bigger the impact on the controller's performance, which in turn negatively affects the network.

Additionally, the current SDN model requires application developers to write code to describe application, operation, and infrastructure requirements in low-level constructs the controller will understand. This introduces added manual steps and learning processes that can have a negative impact on an organization's ability to make quick changes to infrastructure or to manage growth within the data center.

The ACI model abstracts applications, operations, and infrastructure, which is another way of saying that, to keep things simple, the model requires only the minimum essential information about each item in order to interact with it. With the OpFlex standard, the complexity remains with the distributed edge devices, which allows for additional network resiliency by avoiding single points of failure; as Cisco puts it, "the data forwarding can still continue to happen even if there is no controller." Additionally, the ACI model would automatically deliver self-documenting policies that are deployed to or cleaned up from devices as needed.
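
To make the declarative idea concrete, here is a loose Python sketch of a policy in the ACI/OpFlex spirit: the controller states intent, and each edge device renders that intent into its own low-level rules. The schema and names below are invented for illustration and are not OpFlex's actual format, which is defined in Cisco's IETF submission.

```python
# Hypothetical sketch of a declarative policy: the controller states *what*
# is wanted; each edge device decides *how* to render it into local
# forwarding rules. The schema is invented and is not OpFlex's real format.

policy = {
    "name": "web-to-db",
    "source_group": "web-servers",
    "dest_group": "databases",
    "contract": {"allow": [{"protocol": "tcp", "port": 3306}]},
}

def render_for_device(policy, device_capabilities):
    """Each device translates the abstract intent into its own rules."""
    rules = []
    for entry in policy["contract"]["allow"]:
        rules.append({
            "match": {"proto": entry["protocol"], "dport": entry["port"]},
            "action": "permit",
            "hw_offload": "acl" in device_capabilities,  # device-specific choice
        })
    return rules

print(render_for_device(policy, {"acl", "vxlan"}))
```
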

Finally, Cisco also announced that the following products will support the new protocol:

  • Cisco Application Centric Infrastructure, Nexus 9000 Series
  • Cisco Nexus 1000V
  • Cisco ASR 9000 Series
  • Cisco Nexus 7000 Series
  • Cisco ASA
  • Cisco SourceFire

For more information on OpFlex, check out this whitepaper from Cisco and this blog post from Shashi Kiran.


Wednesday, August 27, 2014

Best Computer Forensics Certifications for 2014


Computer forensics is one of the more challenging IT disciplines, and certified professionals remain in high demand. Yet computer forensic certifications remain something of a "wild" frontier. From two dozen available credentials, we list the five best options for 2014.

Today, there are a number of high-quality certification programs that focus on digital investigations and computer forensics. However, there are also certifications and programs in this area that are far less transparent, well-documented and widely known.

What's creating this demand for new programs in computer forensics? Consider the following:
  • Computer crime continues to be a major growth area. As more cyber crimes get reported, more investigations and qualified investigators are needed. This is good news for law enforcement and private investigators who specialize in computer forensics.
  • There's a high demand for qualified computer forensics professionals as nearly every police department is in need of a trained candidate with suitable credentials.
  • IT professionals interested in working for the federal government (either as full-time employees or private contractors) must meet certain minimum training standards in information security. Computer forensics qualifies as part of the mix needed to meet such requirements, which further adds to the demand for certified computer forensics professionals.

As a result, there is a continuing rise of companies that offer computer forensic training and certifications, many of which are "private label" credentials that are not well recognized. Making sense of all the options and finding the certification that's right for you might be more difficult than it seems.
A recent survey we conducted for SearchSecurity.com (June 2013) on available information security certifications turned up just under two dozen computer forensics and anti-hacking credentials. These are all somewhat well-known and on the up-and-up. But in pulling those materials together, we deliberately ignored programs that didn't publish the sizes of their certified populations or that are associated with mandatory high-dollar training. (A small certified population usually means the program is just getting started or not doing very well; by contrast, we generally look for programs with no fewer than 5,000 certified professionals. Expensive training sometimes indicates a strong profit motive in signing people up for certification.)
After a closer analysis of all of the available programs out there, we've identified the five best computer forensics certifications for 2014.
Certified Computer Examiner (CCE)

The Certified Computer Examiner (CCE) comes from the International Society of Forensic Computer Examiners, aka ISFCE.
It is well-recognized in the industry and in the law enforcement community as a leading credential for computer forensics professionals. Private-sector holders usually include security officers and managers, IT administrators or managers, security or forensics consultants, systems and data security analysts and investigators, and even some lawyers and HR managers. Law enforcement holders usually serve as forensic investigators, analysts, or technicians, and conduct official investigations to research or prosecute computer crimes.
The certification process for the CCE includes both a proctored online multiple-choice exam and a hands-on forensic analysis of a floppy disk or CD-R optical disk. When the online exam is completed, applicants conduct a thorough forensic examination of the test media they are supplied with. Together, the written and hands-on portions are intended to verify a candidate’s skills and knowledge in the area of computer forensics.
The CCE training classes usually run for 5 days in the classroom (or 40 hours of online or self-paced materials). Instructor-led versions generally cost $2,500 to $3,500 in North America, and are highly regarded. Online or self-paced versions may be somewhat less expensive but don’t always deliver direct instructor contact.
Table 1: Certified Computer Examiner (CCE)

Certification Name: Certified Computer Examiner (CCE)

Prerequisites/Required Courses: One of the following is required:
  • Training at a CCE Bootcamp Authorized Training Center
  • Proof of other digital forensics training (can be self-study); must be approved by the Certification Board
  • At least 18 months of verifiable experience conducting digital forensic examinations
Candidates cannot have a criminal record.

Number of Exams: 2 exams are required to earn the CCE:
  • Online Written Exam (75 questions in 45 minutes; 70% passing score required to proceed to the Practical Examination)
  • Practical Examination (three actual scenarios to investigate)
An average score of 80% for both exams is required to earn the CCE.

Cost per Exam: $395 USD for both exams in the USA; prices vary by location elsewhere

URL:

Self-study Materials: There are no books or Exam Crams available for this topic, but the ISFCE publishes a complete list of suggested study materials (books and online materials), all of which are readily available to the public: www.isfce.com/study.htm

Certified Hacking Forensic Investigator (CHFI)

The International Council of E-Commerce Consultants, aka EC-Council, is a well-known training and certification organization that specializes in the areas of anti-hacking, computer forensics, and penetration testing.
The Certified Hacking Forensic Investigator (CHFI) certification emphasizes forensics tools, analytical techniques, and procedures involved in obtaining, maintaining, and presenting computer forensic evidence and data in a court of law. EC-Council offers training for this credential but permits candidates to challenge the exam without taking the course, though payment of a non-refundable $100 application fee is required.
The CHFI course runs for 5 days and covers a wide range of topics and tools (a detailed course description is available). Topics include a comprehensive cyber-crime overview, in-depth coverage of the computer forensics investigation process, search and seizure of computers, working with digital evidence, incident handling and first responder procedures, gathering volatile and non-volatile data from a Windows computer, recovering deleted files and partitions from Windows, Macintosh, and Linux systems, using AccessData FTK and EnCase, steganography, password cracking, log capturing tools and techniques, investigating network traffic, wireless attacks, Web attacks, and e-mail crimes. Courseware is available, as well as instructor-led classroom training.
EC-Council also offers numerous other related certifications of potential value to readers interested in the CHFI. These include the Certified Ethical Hacker (CEH), EC-Council Certified Security Analyst (ECSA), EC-Council Certified Incident Handler (ECIH), and Licensed Penetration Tester (LPT). Visit the EC-Council site for more information on these popular and respected credentials.
Table 2: Certified Hacking Forensic Investigator (CHFI)

Certification Name: Certified Hacking Forensic Investigator (CHFI)

Prerequisites/Required Courses: Training is not mandated to earn the CHFI, but it is both available and recommended. The training class costs about $3,000 for instructor-led classroom training, $1,600 for online self-paced training, or $650 for self-study courseware. Mobile courses are also available, which start at $2,000.

Number of Exams: 1 exam: 312-49 or EC0-349 (150 questions, 4 hours, 70% passing score, multiple choice)

Cost per Exam: $500 (or local currency equivalent) at either Prometric or Pearson VUE testing centers

URL:

Self-study Materials: CHFI Study Guide available at Amazon; some practice exams also available.
Certified Forensic Computer Examiner (CFCE)

The International Association of Computer Investigative Specialists (IACIS) is the organization behind the Certified Forensic Computer Examiner (CFCE) credential. This organization caters primarily to law enforcement personnel (and you must be employed in law enforcement to qualify for regular IACIS membership), but the organization also offers associate membership to retired law enforcement personnel and to full-time contractors to law-enforcement organizations.
A formal application form, along with an application fee, is necessary to join IACIS. Those who are not current or former government or law enforcement employees or are not forensic contractors to a government agency can apply for Associate Membership to IACIS, provided they can pass a background check.
Earning the CFCE requires passing a two-step testing process that includes a Peer Review and CFCE certification testing. Peer Review consists of accepting and completing assigned problems based on core knowledge and skills areas for the credential. These must be solved, and presented to a mentor for initial evaluation (and assistance, where needed) before being presented for peer review. Upon successful conclusion of peer review, candidates must work independently to analyze and report upon a forensic image of a hard drive provided to them. Following specific instructions, a written report is prepared to document the candidate’s activities and findings.
Once that report is accepted and passed, the process concludes with a written examination. A passing score of 80 percent or better is required for both the forensic report and the written exam to earn the CFCE.
Despite the time and expense involved in earning a CFCE, this credential enjoys high value and excellent name recognition in the computer forensics field. Many forensics professionals consider the CFCE to be a necessary "merit badge" to earn, especially for those who work in or for law enforcement.
Table 3: Certified Forensic Computer Examiner (CFCE)

Certification Name: Certified Forensic Computer Examiner (CFCE)

Prerequisites/Required Courses: Training is not mandated to earn the CFCE, but it is both available and recommended. The two-week Basic Computer Forensic Examiner (BCFE) instructor-led classroom training costs $2,795. (IACIS membership is required to take IACIS courses.) If you choose not to attend the training, you can enter the program by registering and paying the $750 registration fee.

Number of Exams: The CFCE involves a two-part process, taken in this order: Peer Review problems and practical examination, then a written certification exam. Satisfactory performance on the Peer Review phase is required to advance to the certification exam. Satisfactory completion of both phases is required to earn the CFCE.

Cost per Exam: $750 for the entire examination process (non-refundable once an application is submitted)

URL:

Self-study Materials: IACIS is the primary conduit for training and study materials for this certification. No books or practice tests are currently available for the CFCE, but Ed2Go offers an online preparation class.

GIAC Certified Forensic Analyst (GCFA)

SANS is the organization behind the Global Information Assurance Certification (GIAC) programs, and is a well-respected and highly regarded player in the information security field in general. SANS not only teaches and researches in this area, it also provides breaking news, operates a security alert service, and serves on all kinds of government, research, and academic information security task forces, working groups, and industry organizations.
The organization's forensics credentials include the intermediate-level GIAC Certified Forensic Analyst (GCFA) and the more senior GIAC Certified Forensic Examiner (GCFE). Neither credential requires taking SANS courses (which enjoy a strong reputation as among the best in the information security community, with high-powered instructors to match) but they are recommended to candidates, and often offered before, during, or after SANS conferences held around the USA at regular intervals.
Both the GCFA and GCFE focus on computer forensics in the context of investigation and incident response, and thus also focus on the skills and knowledge needed to collect and analyze data from Windows and Linux computer systems in the course of such activities. Candidates must possess the necessary skills, knowledge, and ability to conduct formal incident investigations and advanced incident handling, including dealing with internal and external data breaches, intrusions, and advanced persistent threats, understanding anti-forensic techniques, and building and documenting advanced digital forensic cases.
The SANS GIAC program encompasses more than 60 information security certifications across a broad range of topics and disciplines. IT professionals interested in information security in general, as well as computer forensics in particular, would be well advised to investigate further at the GIAC home page.
Table 4: GIAC GCFA and GCFE

Certification Names: GIAC Certified Forensic Analyst (GCFA); GIAC Certified Forensic Examiner (GCFE)

Prerequisites/Required Courses: None.
  • Course FOR508: Advanced Computer Forensic Analysis and Incident Response is recommended for the GCFA
  • Course FOR408: Computer Forensic Investigations – Windows In-Depth is recommended for the GCFE

Number of Exams: 1 exam for each credential (115 questions, 3 hours); passing score of 69% for the GCFA, 71% for the GCFE

Cost per Exam: $999 (or local currency equivalent) for the challenge exam (no training); $579 when taken in conjunction with a SANS training class (classes range from $5,000 to $5,500); recertification attempts are $399

URL:

Self-study Materials: Study guides and practice exams are available for both the GCFA and GCFE, through Amazon and other typical channels.

Professional Certified Investigator (PCI)

The Professional Certified Investigator (PCI), a senior-level computer investigations and forensics credential, comes from ASIS International.
Founded in 1955, with more than 38,000 members worldwide, the former American Society for Industrial Security (now known simply as ASIS International) is one of the oldest and best-known information security bodies around. The PCI's primary focus is on investigations, covering case management and investigative techniques and procedures, along with case presentation.
ASIS adopted a book called The Professional Investigator’s Manual as its sole information source for this exam.
The PCI exam devotes 29% of its coverage to case management (analyze for ethical conflicts, analyze and assess case elements and strategies, determine and develop strategy through review of procedural options, manage and implement necessary investigative resources), 50% to investigative techniques and procedures (electronic and physical surveillance, interviews and interrogations, collect and preserve objects and data, research by physical and electronic means, collect and report relevant information, use computers and digital media to gather information and evidence, and more), and 21% on case presentation (prepare report to substantiate investigative findings, and prepare and present testimony). This is the only credential in this article that really teaches people how to analyze and present information for expert reports and testimony.
For practicing or aspiring computer forensics professionals who wish to work on legal cases involving their expertise, especially those who expect to appear in depositions or at trial, the PCI is THE credential to have. It ensures holders are ready to withstand the rigors of the legal system and to present themselves and their evidence to best effect in the courtroom.
Table 5: Professional Certified Investigator (PCI)

Certification Name: Professional Certified Investigator (PCI)

Prerequisites/Required Courses: Candidates must meet the PCI qualification requirements, which include:
  • 5 years of investigation experience, with two or more years in case management
  • High school diploma or GED equivalent
No course is required, but the following training is available from ASIS:
  • An online review course ($400 for members, $550 for non-members)
  • A CD ($500 for members, $650 for non-members)
  • A two-day classroom review ($725 for members, $925 for non-members; discounts apply for early sign-up)

Number of Exams: 1 exam (125 multiple-choice questions; no information about duration is published)

Cost per Exam: Computer-based testing: $300 ($450 for non-members); $200 ($350 for non-members) for subsequent retesting. Pencil-and-paper testing: $200; $100 for subsequent retesting.

URL:

Self-study Materials: Professional Investigator's Manual, ISBN 978-1-934904-02-2 (hardcover $105 member, $175 non-member; softcover $64 member, $93 non-member)

Beyond the top 5 computer forensics certifications listed in this article, there are lots of other certification programs that can help to further the careers of IT professionals who work in computer forensics. 
In particular, credentials from AccessData (AccessData Certified Examiner: ACE) and EnCase (EnCase Certified Examiner: EnCE) are worth pursuing for those who already use (or plan to use) the forensics toolsets and platforms available from those vendors. AccessData is well known for its FTK Forensic Toolkit, which enjoys considerable use in law enforcement and in private research and consulting firms. The same goes for Guidance Software's EnCase, which is also widely used in the field.
And if you look around online, you'll find numerous other forensics hardware and software vendors that offer certifications, and plenty of other organizations that didn't make the cut for the 2014 list of the best computer forensics certifications. But before you wander beyond the items already mentioned in this article, you might want to research the sponsoring organization's history and the number of people who've earned its credentials, and determine whether the sponsor not only requires training but also stands to profit from its purchase.
You might also want to talk with a practicing computer forensics professional as a final check, to ask (a) whether they've heard of your other candidate and (b) if so, what they think of its offerings.
If you do your homework, you won't get burned. Certified computer forensics professionals are sure to remain in high demand for 2014 and beyond.


Tuesday, August 26, 2014

Effective mobile device management for enterprises


Intro

Mobile devices have become an essential facet of modern business operations, used by staff for everything from collaboration to communication. It’s now essential that the proper level of attention is given to mobile device management.

Failure to correctly handle the implementation and ongoing upkeep of smartphones, tablets and mobile devices could lead to lower productivity and possibly even security shortcomings.

This article will explore the growth of mobile devices in enterprise use, different applications within companies and processes and solutions for correct management.

Mobile device management cannot be ignored, and requires the attention of the enterprises operating the technologies.

Growth of mobile devices

Mobile devices are growing in business use, with new highs expected to be reached over the remainder of this year. A new report from research organisation Gartner has outlined the growth of several mobile technologies.

Mobile phone sales have been estimated to reach 1.9 billion units in 2014, representing a 3.1 per cent increase from 2013. Smartphones in particular are expected to make up 88 per cent of global phone sales by 2018.

Tablet growth, on the other hand, is expected to slow, with sales hitting 256 million units; still a 23.9 per cent increase from 2013.

The next few years could see new mobile devices enter the market, such as wearable computers. These could certainly impact enterprises as they continue to advance and see increased adoption.

Mobile devices within a company

Few technologies have had such a massive impact on businesses as mobile devices, and it’s a certainty that they’ll continue to disrupt operations in the near future.

Mobile devices can overhaul how companies operate, improving communication throughout all levels of the organisation, and enabling greater productivity and collaboration regardless of location.

In addition, the devices can also work seamlessly with new cloud platforms, enabling staff to access necessary information and applications from anywhere with an internet connection. Again, improved devices and faster networks will continue to drive innovation.

Here are two uses for mobile devices within an enterprise:

• Communication – Staff can access enhanced communication through a number of platforms, including video conferencing, calling or messaging.
• Collaboration – Mobile devices can access productivity tools over the internet, so workers can edit documents with other users simultaneously. Many new platforms operated by enterprises also have cloud and mobile support.

Mobile Device Management

With the proliferation of mobile devices within the business environment, the need for an effective management solution is clear. Mobile device management enables enterprises to manage the rapidly expanding mobile market, and ensure continued effectiveness and ongoing safety.

An effective mobile device management strategy should enable visibility of all mobile device operations within a business, and provide a strong level of security.

Ideally, a strategy can be used to set up devices initially, regardless of operating system, and maintain ongoing security and maintenance as required. Application scanning is also a useful tool, as it can check devices for any potentially harmful applications.

The correct management tools

Using IBM Endpoint Management and IBM Composite Application Management is one of the best approaches for enterprises.

These technologies, along with foundation infrastructure monitoring, ensure a comprehensive management solution is put in place throughout the IT environment.

Endpoint and infrastructure management systems can lower the cost of managing and securing enterprise mobile devices, as well as servers, laptops and desktops. In addition, these systems operate smoothly regardless of whether devices are owned by the company or individuals.

Other benefits also require consideration, such as the ability for companies to quickly isolate problems as they arise. In addition, services can also be prioritised so alerts are generated only when an issue occurs.

Summary

Mobile devices offer companies a significant number of benefits, but to see effective ongoing management it’s essential that the correct systems are in place.

Juniper's Latest SDN Offerings Cater To Service Providers


Juniper Networks announced the expansion of its software defined networking (SDN) program with new hardware and software designed to help communication service providers (CSPs) accelerate service creation and network utilization. The products and services announced at the Mobile World Congress yesterday are also designed to help carriers reduce both capital expenditure and operational spending.
Due to a flood of data from cloud computing, mobile and internet connected devices, and high definition video streaming, CSPs are struggling to stay profitable and are looking for ways to lower operational costs.
Juniper Networks will expand on its SDN portfolio to help service providers build high-IQ networks and cloud environments. The networks are designed to be secure, automated, and scalable, enabling rapid provisioning based on actionable intelligence. The solutions from Juniper would also help carriers address future needs as additional traffic demands increase.
Juniper is offering Junos Fusion, an automation tool designed to control thousands of both Juniper and independent network elements from a single management plane. According to Juniper, this software will reduce network complexity and operational costs by collapsing the underlying transport elements (microwave, mobile, optical, etc.) into a single point of control from the Juniper Networks MX Series or PTX Series routing platform. Junos Fusion will be available in the second quarter of 2014.
Juniper is also unveiling its Juniper Networks NorthStar Controller, which will leverage open industry-standard protocols. Based on operator-defined performance and cost requirements, the NorthStar Controller will automate the process of identifying and programming optimal paths within multi-vendor networks. Service providers will avoid overprovisioning and better support mobile and cloud traffic.
The NorthStar Controller integrates WANDL technology and includes intelligent analytics and modeling capabilities to enhance the performance and cost-effectiveness of an operator's network infrastructure.
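
As a toy illustration of the kind of cost-based path computation such a controller automates, here is plain Dijkstra over operator-assigned link costs in Python. This is not Juniper's or WANDL's actual algorithm, and the topology and costs are invented.

```python
# Toy illustration of cost-based path selection, the sort of computation a
# centralized controller performs across a multi-vendor topology. Plain
# Dijkstra over operator-assigned link costs; topology is invented.
import heapq

def best_path(links, src, dst):
    """links: {node: [(neighbor, cost), ...]} -> (total_cost, path_list)."""
    queue, seen = [(0, src, [src])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, c in links.get(node, []):
            if nbr not in seen:
                heapq.heappush(queue, (cost + c, nbr, path + [nbr]))
    return float("inf"), []

topology = {
    "A": [("B", 10), ("C", 3)],
    "B": [("D", 2)],
    "C": [("B", 4), ("D", 8)],
}
print(best_path(topology, "A", "D"))  # (9, ['A', 'C', 'B', 'D'])
```
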
Juniper is also introducing a 1 Tbps line card, adding further scale to the PTX Series router. The new line card doubles the PTX router's per-slot capacity to provide operators with more usable per-path bandwidth. This card will be available in the second quarter of 2014.
To create value for CSPs, providers can use the new Junos Video Focus, Junos Subscriber Aware, Junos Application Aware and Junos Policy Control products, available on MX Series routers, which allow operators to use the MX in the Gi-LAN as a Service Control Gateway (SCG). This will allow service providers to customize services based on both who is using the network and what they are using it for. The SCG functions will be available in mid-2014.
Juniper Networks will provide professional services to help customers move to these highly customizable networks. No information was provided regarding the cost of the new products.

Software Defined Networking: Introduction to OpenFlow

The Road to SDN

Computer networking has undergone monumental changes in the last few decades. The next networking evolution is here: Software Defined Networking, or SDN, enabled by OpenFlow. Here's what SDN means and how OpenFlow can be implemented in the real world.


Over the last several decades there have been a large number of changes in networking technology that led to the development of Software Defined Networking (SDN). This layered evolution, along with other advances in data center technologies, would produce the underlying foundations needed to support today's infrastructures.

The lower layers have evolved from slow Time Division Multiplexing (TDM) circuits to high-speed optical and copper Ethernet on the Wide Area Network (WAN), and from thick coax-based shared Ethernet and Token Ring networks to Unshielded Twisted Pair (UTP) and fiber-based full-duplex dedicated 1, 10 and 100 Gigabit Ethernet on Local Area Networks (LANs).

At the upper layers, this has included the Internet Protocol Version 6 (IPv6) and a flurry of different protocols, such as Open Shortest Path First (OSPF), Spanning Tree Protocol (STP), and Multiprotocol BGP (MP-BGP), to manage the ever changing requirements of highly networked enterprise infrastructures and end-user devices.

Once configured, and outside of the modern world of regulations and ever-changing service requirements, these technologies work very well. As the networks of the last twenty years have changed, they have been adapted through the wide-scale use of technologies like STP, Virtual LAN (VLAN), OSPF, Border Gateway Protocol (BGP), and more recently technologies like Virtual Private Networks (VPN), Virtual Routing and Forwarding (VRF), Multiprotocol Label Switching (MPLS) and MP-BGP.

While these technologies work well on their own, they all require some amount of individual node configuration, including a high level of local computing capacity to operate and manage their functions, adding even more overhead to growing networks.

Software Defined Networking is an evolving data center solution that moves most of the local complexities back to a centralized, controlling device or devices. With SDN the software that resides on these controllers makes the higher level decisions and sends this information down to each physical device.


SDN, OpenFlow and APIs


As there are a number of different definitions for SDN, for the purpose of this article, it will be defined as a method of network management that decouples the control and data plane functionalities of a network device.

The control plane functions of these network devices include processes like the frame forwarding decisions made by STP and the packet routing decisions made by BGP and OSPF. They also include other things like controlling buffers and queues, as well as traffic separation with technologies like VPNs, VLANs, VRF and MPLS. On modern networks, all of these are typically configured, and decisions made, at the network device itself (locally). Once these different technologies have done their work, the device is also tasked with switching and/or routing the traffic based on the decisions made; this movement (and sometimes short-term storage) of traffic happens at the data plane.

SDN allows this control plane functionality to be copied and/or moved to a centralized controller; this controller -- the software in software defined networking -- is then responsible for making the initial decisions as to how traffic is switched or routed. This behavior is then sent to the data plane device (typically a switch at the moment) where it is cached and performed.

OpenFlow is the main interface protocol that handles communication between the control and data layer devices (the controller and the switch) and defines how a controller and device will communicate with each other to ensure that traffic flows are switched/routed and/or controlled as expected. Keep in mind that the controller can be configured as broadly (altering the behavior of whole subnets) or as narrowly (altering the behavior of a single device) as is required for a specific flow of traffic.
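
For a concrete taste of how a controller expresses control plane decisions to a switch over OpenFlow, here is a minimal sketch using the open source Ryu controller framework (one OpenFlow controller among several; a simple flooding app, not a production design).

```python
# Minimal OpenFlow 1.3 controller app sketched with the open source Ryu
# framework. The control plane decision lives here; the switch only caches
# and executes the rules and actions it is given.
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import CONFIG_DISPATCHER, MAIN_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3

class FloodHub(app_manager.RyuApp):
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
    def on_switch_connect(self, ev):
        # Install a table-miss rule so unmatched packets reach the controller.
        dp = ev.msg.datapath
        ofp, parser = dp.ofproto, dp.ofproto_parser
        actions = [parser.OFPActionOutput(ofp.OFPP_CONTROLLER, ofp.OFPCML_NO_BUFFER)]
        inst = [parser.OFPInstructionActions(ofp.OFPIT_APPLY_ACTIONS, actions)]
        dp.send_msg(parser.OFPFlowMod(datapath=dp, priority=0,
                                      match=parser.OFPMatch(), instructions=inst))

    @set_ev_cls(ofp_event.EventOFPPacketIn, MAIN_DISPATCHER)
    def on_packet_in(self, ev):
        # Centralized decision: answer every unmatched packet with a flood.
        msg = ev.msg
        dp = msg.datapath
        ofp, parser = dp.ofproto, dp.ofproto_parser
        out = parser.OFPPacketOut(datapath=dp, buffer_id=msg.buffer_id,
                                  in_port=msg.match['in_port'],
                                  actions=[parser.OFPActionOutput(ofp.OFPP_FLOOD)])
        dp.send_msg(out)
```

Launched with ryu-manager, the app installs its table-miss rule on every switch that connects; a real application would instead compute per-flow rules (MAC learning, path selection, QoS) and push them down with flow-mod messages so subsequent packets are handled entirely in the data plane.
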

SDN is also behind a new and fast-evolving trend that revolves around the use and availability of application programming interfaces (APIs) specifically developed to allow organizations to granularly control vendor devices. The Open Networking Foundation (ONF) is at the center of the SDN movement and has provided a well-designed illustration of the API-integrated SDN architecture, shown in Figure 1:

Figure 1 - Software Defined Networking Architecture (Source: Open Networking Foundation)
As illustrated, APIs provide direct access to the controller, and in turn to the switching and/or routing devices. The potential uses of these APIs are nearly limitless, assuming support is offered for the specific platform deployed. As many of these APIs are still young, the real power they can provide will most likely not be seen in the short term, as they continue to be developed, tested, and put through limited deployments. But if they continue to be developed and supported, the room for their expansion should not be underestimated.

The big thing to watch at this point is whether the larger vendors will continue their support for this process and provide comprehensive API support in their hardware. The problem with SDN is the suggestion that many parts of these vendors' product lines become redundant and unneeded, with a comprehensively programmed and managed controller platform taking their place. Since many of these vendors have a lot of their intellectual property tied up in device software, this can be a problem. The smart money is on a hybrid solution being developed that enables the benefits of SDN while also leveraging the power of local device intelligence.


SDN and OpenFlow Use Cases

There are certainly a large number of use cases for Software Defined Networking that illustrate the many ways problems can be addressed using SDN with OpenFlow; the next couple of sections describe some of them.

Cloud Backbone Provider

The design of networks within data centers has advanced over the last several years; platforms have changed considerably (from physical to virtual) and the sheer amount of traffic that must be moved has grown exponentially. Traditional methods of network design work rather well within a data center, as most of these locations are already interconnected with fiber, making intra-data center bandwidth reasonably cheap.

But modern network traffic flows are not limited to a single data center's infrastructure. A lot of traffic not only flows from the data center to a host workstation (or PC), but between data centers (inter-data center) as well.

Traditionally the WAN connectivity between data centers has been provisioned in one of two ways:


  • Capacity is ordered as needed.
  • Capacity is ordered initially with a good amount of it being overprovisioned compared with the current needs of the system.


The problem with waiting until the capacity is needed is that, traditionally, the provisioning process takes time, and often more time than the corporate end-users waiting on their traffic can afford. On the other hand, the problem with overprovisioning is that this part of the network often remains dormant and unused, becoming a waste of money.

What if a cloud service provider could implement an SDN solution that allows a company to provision its needed capacity in a very small window of time (thus reducing the need to buy excessive unused capacity)? This could even be built into a provider's management portal.

On the provider's network, the whole network could be interconnected with OpenFlow-capable devices that are centrally managed; if a business customer asks for an additional 10 Mbps of capacity to be allocated from data center X to data center Y, the central management system would communicate with the OpenFlow devices and allocate the capacity on the fly, without any access to the individual devices being required. Keep in mind that this is not limited to bandwidth; it could also be implemented with Quality of Service (QoS). A rough sketch of such a portal-driven workflow follows.
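
The endpoint, payload fields, and response code below are hypothetical; a real deployment would use the actual northbound API of the provider's controller.

```python
# Hypothetical sketch: a tenant portal asks a central SDN controller to
# allocate extra inter-data-center capacity. The URL and JSON schema are
# invented for illustration.
import requests

def request_capacity(src_dc, dst_dc, mbps):
    """Ask the controller to program extra capacity; True on success."""
    resp = requests.post(
        "https://controller.provider.example/api/v1/paths",  # hypothetical
        json={
            "source": src_dc,
            "destination": dst_dc,
            "bandwidth_mbps": mbps,
            "qos_class": "business",  # QoS could be requested the same way
        },
        timeout=10,
    )
    return resp.status_code == 201  # created: controller programs the devices

if request_capacity("dc-x", "dc-y", 10):
    print("10 Mbps allocated from data center X to data center Y")
```
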

Private and Hybrid Cloud

The private cloud is another topic that has gotten a lot of attention lately. Many organizations are changing the way they implement and manage their IT departments by consolidating operations into a single private cloud of potential services. This private cloud is then used as an internal service provider from which individual departments provision services. Compare this with IT departments tasked with providing many different solutions to multiple departments, making it almost impossible to adequately support all types of services out of the same department.

The use of SDN-enabled components within a private cloud makes a lot of sense in these implementations, as they allow an organization to specifically tailor the services offered to internal departments (the same way a public cloud provider tailors theirs); this can include everything from bandwidth management to QoS to security. When implemented correctly, it certainly has the potential to lower the IT department's administrative overhead and bring down response and ticket-resolution times.

On the other hand, the hybrid cloud comes in when an infrastructure needs to extend to a public cloud for some of its services. Done traditionally, this runs into the same problems addressed in the private cloud; but what if these services could be provisioned on the fly as well? For example, what if a public cloud store were to be used for external storage or for offsite resource expansion?

An SDN approach would allow the internal private cloud operations to communicate with the public cloud operations to statically or dynamically set up these resources as needed, saving time and money.
