
Future Cell Site Towers: IoT Data, Broadband, Leasing

As demands on the Internet continue to grow, the future of cell site towers needs an in-depth look, especially as IoT makes homes increasingly “smart” and the demand for transmission equipment keeps climbing. How will this growth happen? Where will transmission towers be located? What are the cost factors, and are any innovations likely to come online soon? Are cell towers even going to be needed?

The base for everything on the Internet is power. Something must generate the electricity for transmission, whether over fiber optic lines or radio waves. How much power is not even a question: engineers know exactly how much it takes to send a given signal a given distance through a given medium.

The power needs of an individual device (think smartphone or smart thermostat) are tiny. Put several of those devices together, however, and the demand grows. Bump the number into the hundreds and thousands and the demand jumps sharply. The upshot for future cell site towers is that they are going to need a LOT of power to handle that volume of data traffic.
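To make the scaling concrete, here is a toy back-of-envelope estimate (the per-device figure is a hypothetical placeholder, not a measured value):

```python
# Toy estimate of aggregate radio power draw as device counts grow.
# The 100 mW per-device average is a hypothetical placeholder, not a measurement.

def aggregate_power_mw(device_count, mw_per_device=100):
    """Total average draw in milliwatts for a fleet of identical devices."""
    return device_count * mw_per_device

print(aggregate_power_mw(1))       # one thermostat: 100 mW, trivial
print(aggregate_power_mw(10_000))  # a dense IoT neighborhood: 1,000,000 mW, i.e. 1 kW
```

The linear scaling is the whole point: nothing about the radio gets cheaper per device as counts climb, so a hub's power budget grows with its device population.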

Simply put, a pocket-sized battery will not deliver the voltage and amperage needed to receive and transmit signals from more than 1,000 devices. “Cell towers will become obsolete only when Chevy Suburbans and Ford F-150s can drive down the Interstate at 70 MPH fully powered by solar panels made in the USA. The demand for bandwidth is growing faster than the carriers can sell smartphones. Even if they came up with some amazing technology that could replace cell towers, it would easily take 10 years or more to implement.” Some may point to signal boosters to handle the need for more and stronger transmissions.

More Power


Signal boosters require more power, and that power must come from somewhere. The demand on the already-stressed power grid will only get worse. Individually, the power draw may be minuscule; added together, it becomes a real issue: the straw that breaks the camel’s back. Battery advances over the past 30 years have been huge, but battery output is still directly tied to the size of the battery. You can’t run a golf cart on a dozen D-cell flashlight batteries.

FCC Regulation


The Federal Communications Commission controls radio wave broadcasts, including those made by wireless devices, and it already regulates signal boosters. “Malfunctioning, poorly designed, or improperly installed signal boosters can interfere with wireless networks and result in dropped or blocked calls, including emergency and 911 calls,” says an FCC Consumer Guide to signal boosters. As more and more devices go wireless, the chances for interference are going to grow.

Future Cell Site Towers in Aesthetic Landscapes


The demand for towers is not going away. Vertical Consultants tracks cell tower agreements and reports the industry is growing. “So again, if cell towers were about to become obsolete, why would the industry leaders be investing billions of dollars to acquire the rights to your cell tower? The answer to this situation is that technology is nowhere near close to finding an economic and reliable replacement for the future cell site towers, and your individual site lease has value to the acquiring company!”

However, the look and location of these towers are changing. A better description for a cell tower, then, is “transmission hub,” or hub for short. Increasingly, municipalities are rejecting the look of giant antenna arrays.

The industry is responding. “Cell tower companies like Crown Castle are installing small cells for carriers’ use on light poles, on top of shopping centers and other places where they fit in with the urban scenery.” In 2010, Crown Castle acquired New Path Networks, which built the nine-antenna medical center system. Where and what these smaller hubs are might surprise you. Twisted Sifter has a list of these different types of antenna hubs.

These hubs still require space, which means buying or leasing that space. A smaller footprint likely will translate into smaller lease payments, but more hubs also mean more leases. Savvy negotiators are going to win this one.
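To see why the lease math cuts both ways, consider a toy comparison (all dollar figures are hypothetical placeholders, not market data):

```python
# Toy lease-economics comparison: one macro tower versus many small hubs.
# All dollar figures are hypothetical placeholders, not real market rates.

macro_sites, macro_lease = 1, 3000   # one big tower at $3,000/month
hub_sites, hub_lease = 12, 400       # twelve small hubs at $400/month each

macro_total = macro_sites * macro_lease   # 3,000 per month
hub_total = hub_sites * hub_lease         # 4,800 per month

# Each hub lease is cheaper, yet the aggregate can exceed the single macro lease.
print(macro_total, hub_total)
```

The per-site payment shrinks, but the number of leases multiplies, which is exactly why savvy negotiators on both sides stand to win or lose here.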

Future Cell Site Towers Get Creative


The demands on wireless networks and high-speed broadband Internet are only going to grow. Consumers have already shown they are willing to pay for the service. Creative thinking will dominate the industry as it moves forward. ISPs must step up their transmission capabilities. The tower manufacturers are already headed in the right direction with smaller hubs that are not eyesores. With the increase in transmission/reception sites, the demand for real estate to plant these hubs is also going to grow.

The future of cell site towers is smaller hubs, more hubs, and hidden hubs. Companies that make these hubs are in the driver’s seat: they determine the power needs and appearance. Location will be set by ISPs or cell companies and real estate owners.


Network Function Virtualization: What Will NFV Do for Operators?

Network function virtualization: as Dylan would say, the times they are a-changin’. Network Function Virtualization has come to the mobile operator, and according to strategic business advisor Northstream, it will be part of a “natural evolution of existing infrastructures,” bringing greater efficiency and lower costs. But the key will be the creation of new services. “NFV in 2017 will be driven by services such as VoLTE, Carrier Cloud, Wi-Fi calling, service chaining, resource sharing and network slicing.”

Network Function Virtualization, aka NFV, was introduced to the world through a white paper that was delivered at the 2012 SDN and OpenFlow World Congress. Authors from thirteen different telecom providers contributed to the work. The paper highlighted several benefits of NFV, including reduced equipment costs, lower power consumption, faster time to market, scalability of services, and vendor interoperability.

The traditional approach to networking involved the dispatch of personnel, either to the data center or to the customer premises, to install the physical devices and cabling required to make the network services function. This sometimes involved a number of “truck rolls” until the network appliance was fully operational. But an implementation that might have taken weeks or even months through the traditional method might only take a few minutes with Network Function Virtualization.

Common appliances that can be replaced by virtualized network functions (VNFs) in the NFV architecture include routers, firewalls, switches, load balancers, and media servers. Instead of physical installs, Network Function Virtualization software can be used to simply “spin out” new services as needed. As traffic volume increases, the system may automatically create VNFs to meet the demand.

When things slow down, the infrastructure will automatically be reduced. Malfunctioning virtual devices will be detected and traffic will be rerouted through a new VNF created just for that purpose.
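A minimal sketch of that elasticity and self-healing loop might look like this (the class, capacity figure, and method names are illustrative, not any real orchestrator's API):

```python
# Minimal sketch of an NFV-style autoscaler: spin VNF instances up with load,
# down when traffic slows, and replace any instance flagged as unhealthy.
# Class names and thresholds are illustrative, not a vendor orchestrator API.

class VnfPool:
    def __init__(self, capacity_per_vnf=100):
        self.capacity_per_vnf = capacity_per_vnf
        self.instances = []          # list of health flags (True = healthy)

    def scale(self, traffic_units):
        """Create or retire VNFs so capacity matches current demand."""
        needed = max(1, -(-traffic_units // self.capacity_per_vnf))  # ceil division
        while len(self.instances) < needed:
            self.instances.append(True)      # "spin out" a new VNF
        while len(self.instances) > needed:
            self.instances.pop()             # retire an idle VNF

    def heal(self):
        """Replace malfunctioning instances; traffic reroutes to the new VNFs."""
        self.instances = [True for _ in self.instances]

pool = VnfPool()
pool.scale(250)            # demand spike: three instances are spun out
print(len(pool.instances))
pool.instances[1] = False  # simulate a malfunctioning VNF
pool.heal()                # detected and replaced automatically
print(all(pool.instances))
pool.scale(50)             # traffic drops: the pool shrinks back to one
print(len(pool.instances))
```

The point of the sketch is that both directions of scaling and the repair path are ordinary software logic, which is exactly what replaces the truck roll.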

Replacing infrastructure is fine, but the real potential is in the expanding service portfolio of the NFV architecture. “By enabling service chaining and resource sharing,” says Northstream, “NFV allows operators to deliver network services to customers and enterprises through software instead of dedicated hardware devices. This represents a major step towards meeting the new demands of industry verticals that are just around the corner.”
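Service chaining itself can be pictured as packets flowing through an ordered list of software functions, where re-ordering the chain is a configuration change rather than a hardware change (the function names here are illustrative, not a standard API):

```python
# Sketch of NFV service chaining: a packet traverses an ordered list of
# virtual functions defined purely in software. Names are illustrative.

def firewall(packet):
    """Drop packets marked as blocked; pass everything else through."""
    return None if packet.get("blocked") else packet

def load_balancer(packet):
    """Assign the packet to one of two hypothetical backends."""
    packet["backend"] = hash(packet["dst"]) % 2
    return packet

def run_chain(packet, chain):
    """Apply each VNF in order; a None result drops the packet."""
    for vnf in chain:
        packet = vnf(packet)
        if packet is None:
            return None
    return packet

chain = [firewall, load_balancer]               # re-ordering the chain is a
result = run_chain({"dst": "10.0.0.5"}, chain)  # config change, not a truck roll
print(result["backend"] in (0, 1))
```

Each element of the chain could be scaled or replaced independently, which is the resource-sharing benefit Northstream describes.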


Network Function Virtualization is not without challenges


While the hardware part has become simpler – many implementations are using off-the-shelf blade servers – there are still plenty of obstacles to overcome. RCR Wireless News explores the key challenges facing ongoing SDN, NFV and cloud deployment models in an interview with Frank Yue, director of application delivery solutions at Radware.

Yue believes that the biggest issue telecom companies need to deal with is orchestration, the automatic deployment of resources in the cloud. Trying to bring things together is “still very targeted and piecemeal”. Providers seem to be in a rush to bring services to market. “Really to get orchestration and everything right,” says Yue, “you need to have all these tiny projects come together in one big cohesive unit, and I don’t think we’re there yet.”  Real time and automation are the key words, according to RCR Wireless editor Dan Meyer. For Frank Yue, the keys are agility and elasticity, terms associated with cloud computing.

Another major challenge is security. How do you maintain the privacy and integrity of your data across the cloud infrastructure? Industry standards have a bearing on security. Yue calls the situation a “big administrative mess”. Without proper standardization, particularly in multi-tenant environments, the potential for security breaches remains.


Network Function Virtualization Standards


One standards body, the European Telecommunications Standards Institute (ETSI), announced NFV Release 2 on September 27, 2016. The statement includes remarks from Telefonica’s Diego Lopez, the newly appointed Chairman of ETSI NFV ISG: “This represents another major step towards our objective of defining a comprehensive set of specifications that will facilitate the deployment of Network Function Virtualization throughout the telecommunication industry, with significant benefits being subsequently derived in many interrelated sectors.” Lopez says that the ETSI NFV Architectural Framework will form the basis for the security, reliability, and integration of NFV going forward.

How quickly will NFV revolutionize the networks of the world? That remains to be seen. It’s being looked at as a potential framework for 5G mobile deployments. Will service chaining fueled by NFV resources make large-scale network installations a simple point-and-click operation?

How will Network Function Virtualization be used in the development of self-healing networks? What other innovations await us in the field of network virtualization? Get ready, because the virtualized future everyone dreamed about is well-nigh upon us.


Does your company plan to deploy NFV any time soon? What do you think about this new technology? How do you think it will affect telecom companies and their customers in the next few years? Please share your comments on Network Function Virtualization below.


Expanding NFV services for MNOs


Tier 1 and Tier 2 mobile network operators are expanding their 4G services, as it is at least five years before 5G networks are ready for early deployment. ARPU, expanding data services, lowering power consumption: these are all needed to be competitive and maintain a healthy profit ratio. If you require an expert recruitment team to fill a key sales or engineering role, or perhaps product management or a strategic leader, you can rely on Nextgen Executive Search to not only meet but exceed your expectations in delivering a candidate shortlist that is ideal for new hires. Click the image below for more information on our mobile network, digital media, telecom services, and wireless connectivity recruitment and to contact us directly.


Smart Objects: Blending AI into the Internet of Things

It’s been more than a decade since the number of internet-connected devices exceeded the number of people on the planet. This milestone signaled the emergence and rise of the Internet of Things (IoT) paradigm and of smart objects, which empowered a whole new range of applications that leverage data and services from the billions of connected devices. Nowadays IoT applications are disrupting entire sectors in both consumer and industrial settings, including manufacturing, energy, healthcare, transport, public infrastructure and smart cities.

Evolution of IoT Deployments


During the past decade IoT applications have evolved in terms of size, scale and sophistication. Early IoT deployments involved tens or hundreds of sensors, wireless sensor networks and RFID (Radio Frequency Identification) systems in small to medium scale deployments within an organization. Moreover, they were mostly focused on data collection and processing, with quite limited intelligence. Typical examples include early building management systems that used sensors to optimize resource usage, as well as traceability applications in RFID-enabled supply chains.

Over the years, these deployments have given way to scalable and more dynamic IoT systems involving many thousands of IoT devices of different types, known as smart objects. One of the main characteristics of state-of-the-art systems is their integration with cloud computing infrastructures, which allows IoT applications to take advantage of the capacity and quality of service of the cloud. Furthermore, state-of-the-art systems tend to be more intelligent, as they can automatically identify and learn the status of their surrounding environment and adapt their behavior accordingly. For example, modern smart building applications are able to automatically learn and anticipate resource usage patterns, which makes them more efficient than conventional building management systems.

Overall, we can distinguish the following two phases of IoT development:

  • Phase 1 (2005-2010) – Monolithic IoT systems: This phase entailed the development and deployment of systems with limited scalability, which used some sort of IoT middleware (e.g., TinyOS, MQTT) to coordinate some tens or hundreds of sensors and IoT devices.
  • Phase 2 (2011-2016) – Cloud-based IoT systems: This period is characterized by the integration and convergence of IoT and cloud computing, which enabled the delivery of IoT applications based on utility models such as Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS). During this phase major IT vendors such as Amazon, Microsoft and IBM established their own IoT platforms and ecosystems based on their legacy cloud computing infrastructures. The latter alleviated the scalability limitations of earlier IoT deployments, which provided opportunities for cost-effective deployments. At the same time the wave of Big Data technologies has opened new horizons in the ability of IoT applications to implement data-driven intelligence functionalities.


AI: The Dawn of Smart Objects in IoT Applications


Despite their scalability and intelligence, most IoT deployments tend to be passive, with only limited interactions with the physical world. This is a serious setback to realizing the multi-trillion-dollar value potential of IoT in the next decade, as a great deal of IoT’s business value is expected to stem from real-time actuation and control functionalities that will intelligently change the status of the physical world.

In order to enable these functionalities, we are recently witnessing the rise and proliferation of IoT applications that take advantage of Artificial Intelligence and smart objects. Smart objects are characterized by their ability to execute application logic in a semi-autonomous fashion that is decoupled from the centralized cloud.

In this way, they are able to reason over their surrounding environments and take optimal decisions that are not necessarily subject to central control. Smart objects can therefore act without being always connected to the cloud. However, they can conveniently connect to the cloud when needed, in order to exchange information with other objects, including information about their own state and the status of the surrounding environment.
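That act-locally, sync-when-connected pattern can be sketched as follows (the class and field names are hypothetical, not drawn from any particular IoT platform):

```python
# Sketch of a smart object that decides locally and syncs state to the cloud
# only when a connection is available. Names are hypothetical, not a real SDK.

class SmartPump:
    def __init__(self):
        self.valve_open = True
        self.pending_reports = []    # state updates queued while offline

    def sense_and_act(self, flow_rate, pressure):
        """Local reasoning: close the valve on a likely leak signature."""
        leak_suspected = flow_rate > 0 and pressure < 1.0
        if leak_suspected:
            self.valve_open = False  # act autonomously, no cloud round-trip
        self.pending_reports.append({"valve_open": self.valve_open})
        return self.valve_open

    def sync(self, cloud):
        """Opportunistic upload of queued state when connectivity returns."""
        cloud.extend(self.pending_reports)
        self.pending_reports.clear()

pump = SmartPump()
pump.sense_and_act(flow_rate=5.0, pressure=0.4)  # leak signature: valve closes
cloud_log = []
pump.sync(cloud_log)                             # state reported after the fact
print(pump.valve_open, len(cloud_log))
```

The decision happens entirely at the object; the cloud only learns about it afterwards, which is the essence of the decoupling described above.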

Prominent examples of smart objects follow:

  • Socially assistive robots, which provide coaching or assistance to special user groups such as elderly with motor problems and children with disabilities.
  • Industrial robots, which complete laborious tasks (e.g., picking and packing) in warehouses, manufacturing shop floors and energy plants.
  • Smart machines, which predict and anticipate their own failure modes, while autonomously scheduling relevant maintenance and repair actions (e.g., ordering spare parts, scheduling technician visits).
  • Connected vehicles, which collect and exchange information about their driving context with other vehicles, pedestrians and the road infrastructure, as a means of optimizing routes and increasing safety.
  • Self-driving cars, which will drive autonomously with superior efficiency and safety, without any human intervention.
  • Smart pumps, which operate autonomously in order to identify and prevent leakages in the water management infrastructure.

The integration of smart objects within conventional IoT/cloud systems signals a new era for IoT applications, which will be endowed with a host of functionalities that are hardly possible nowadays. AI is one of the main drivers of this new IoT deployment paradigm, as it provides the means for understanding and reasoning over the context of smart objects. While AI functionalities have been around for decades in various forms (e.g., expert systems and fuzzy logic systems), AI systems have not been suitable for supporting smart objects that could act autonomously in open and dynamic environments such as industrial plants and transportation infrastructures.

This is bound to change because of recent advances in AI based on deep learning, which employs advanced neural networks and provides human-like reasoning functionalities. During the last couple of years we have witnessed the first tangible demonstrations of such AI capabilities applied to real-life problems. For example, last year, Google’s AlphaGo engine managed to defeat a Chinese grandmaster at the game of Go. This signaled a major milestone in AI, as human-like reasoning was used instead of an exhaustive analysis of all possible moves, as was the norm in earlier AI systems in similar settings (e.g., IBM’s Deep Blue computer that beat chess world champion Garry Kasparov back in 1997).

Implications of AI and IoT Convergence for Smart Objects


This convergence of IoT and AI signals a paradigm shift in the way IoT applications are developed, deployed and operated. The main implications of this convergence are:

  • Changes in IoT architectures: Smart objects operate autonomously and are not subject to the control of a centralized cloud. This requires revisions to the conventional cloud architectures, which should become able to connect to smart objects in an ad hoc fashion towards exchanging state and knowledge about their status and the status of the physical environment.
  • Expanded use of Edge Computing: Edge computing is already deployed as a means of enabling operations very close to the field, such as fast data processing and real-time control. Smart objects are also likely to connect to the very edge of an IoT deployment, which will lead to an expanded use of the edge computing paradigm.
  • Killer Applications: AI will enable a whole range of new IoT applications, including some “killer” applications like autonomous driving and predictive maintenance of machines. It will also revolutionize and disrupt existing IoT applications. As a prominent example, the introduction of smart appliances (e.g., washing machines that maintain themselves and order their detergent) in residential environments holds the promise to disrupt the smart home market.
  • Security and Privacy Challenges: Smart objects increase the volatility, dynamism and complexity of IoT environments, which will lead to new cyber-security challenges. Furthermore, they will enable new ways for compromising citizens’ privacy. Therefore, new ideas for safeguarding security and privacy in this emerging landscape will be needed.
  • New Standards and Regulations: A new regulatory environment will be needed, given that smart objects might be able to change the status of the physical environment leading to potential damage, losses and liabilities that do not exist nowadays. Likewise, new standards in areas such as safety, security and interoperability will be required.
  • Market Opportunities: AI and smart objects will offer unprecedented opportunities for new innovative applications and revenue streams. These will not be limited to giant vendors and service providers, but will extend to innovators and SMBs (Small Medium Businesses).

Future Outlook


AI is the cornerstone of next generation IoT applications, which will exhibit autonomous behavior and will be subject to decentralized control. These applications will be driven by advances in deep learning and neural networks, which will endow IoT systems with capabilities far beyond conventional data mining and IoT analytics. These trends will be propelled by several other technological advances, including Cyber-Physical Systems (CPS) and blockchain technologies. CPS systems represent a major class of smart objects, which will be increasingly used in industrial environments.

They are the foundation of the fourth industrial revolution, bridging physical processes with the digital systems that control and manage industrial processes. Currently CPS systems feature limited intelligence, which is set to be enhanced by the advent and evolution of deep learning. On the other hand, blockchain technology (inspired by the popular Bitcoin cryptocurrency) can provide the means for managing interactions between smart objects, IoT platforms and other IT systems at scale. Blockchains can enable the establishment, auditing and execution of smart contracts between objects and IoT platforms, as a means of controlling the semi-autonomous behavior of the smart objects.

This will be a preferred approach to managing smart objects, given that they belong to different administrative entities and should be able to interact directly in a scalable fashion, without needing to authenticate themselves against a trusted entity such as a centralized cloud platform.

In terms of possible applications, the sky is the limit. AI will enable innovative IoT applications that boost automation and productivity while eliminating error-prone processes. Are you getting ready for the era of AI in IoT?

 


Self Organizing Networks Driving Down HetNet Costs

As technology continues to advance, self organizing networks are driving down the cost of mobile HetNets. Known as SON, the technology holds promise not only for the large cellular carriers that run LTE but also for smaller networks running on Wi-Fi and femtocells. The goal of all carriers is to lower overall operating costs and increase cost effectiveness. Should SON be used with HetNets (heterogeneous networks), there are advantages and disadvantages. This article will look at each side.

Self Organizing Networks Advantages


An autonomous SON can, by nature, function without human operators. This means base stations and access points are configured and optimized automatically. Macrocells still require a technician’s involvement, but the advent of self organizing networks in combination with small-cell technology marked a powerful shift in resource management: there was no longer a reason to send a technician to each new small cell in a selected market area.

The ultimate goal, though the technology is still in its infancy (if not still on the drawing board), is to have self organizing networks implemented in the RAN. The autonomous nature of SON means no human intervention for organizing and optimizing. All a carrier would need to do is create the cell site. The SON would handle the RF frequencies and their channels, determine power levels, build the lists of neighboring elements and make the other necessary configurations that historically required manual input.

In a sample case, a cell site within a SON-capable network goes down in an act of nature or an accident. The sites around the downed cell immediately and automatically reorganize themselves to provide coverage for the affected area. This gives carriers time to make logistical decisions or wait until normal working hours to dispatch a technician for repairs. Clearly, the self-operating and self-repairing functions of the self organizing network have clear profit benefits for carriers, including the larger service carriers and the smaller ones that depend on communication technologies besides LTE.
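The outage scenario reduces to a simple control rule: mark the failed site down and have its neighbors raise transmit power to cover the hole (the topology and the 3 dB boost here are illustrative, not a real SON algorithm):

```python
# Toy sketch of SON self-healing: when a cell goes down, its neighbors
# boost transmit power to cover the coverage hole. Values are illustrative.

def self_heal(cells, failed_id, boost_db=3):
    """Mark the failed cell down and boost each of its neighbors."""
    cells[failed_id]["up"] = False
    for neighbor_id in cells[failed_id]["neighbors"]:
        cells[neighbor_id]["tx_power_dbm"] += boost_db
    return cells

network = {
    "A": {"up": True, "tx_power_dbm": 40, "neighbors": ["B", "C"]},
    "B": {"up": True, "tx_power_dbm": 40, "neighbors": ["A"]},
    "C": {"up": True, "tx_power_dbm": 40, "neighbors": ["A"]},
}
self_heal(network, "A")                 # site A goes down...
print(network["B"]["tx_power_dbm"])     # ...B and C step up to 43 dBm
```

A real SON would also retune antenna tilts and neighbor lists, but the automatic, technician-free reaction is the part that saves the carrier money.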


Self Organizing Network Disadvantages


While a self organizing network can innately make the necessary reconfigurations to neighboring cells, sensing the surrounding network in the downed cell’s immediate area may prove difficult. There are two potential scenarios: a SON-capable base station passively discovers and configures its neighbors, or the information comes from queries to neighboring stations.


Here is the fundamental issue. Base station receivers are built for the uplink only; they receive transmissions from user devices into the network. But SON-capable stations must additionally be able to receive downlink signals to learn power levels and neighboring-cell parameters.

This means SON-capable stations must be frequency-agile for both links.

Whether receivers are set up for a dedicated downlink or for time-division-duplexed (TDD) systems, a SON-capable station will require windows in which it can receive downlink transmissions, a situation that can lead to unintended additional downtime during the process.
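A toy frame schedule shows where that downtime comes from: every Nth frame is spent listening to neighbors instead of serving traffic (this is an illustration only, not an actual TDD frame structure):

```python
# Toy model of a SON station reserving periodic downlink-sniffing slots.
# Every Nth frame listens to neighbors instead of serving users, which is
# the "additional downtime" cost. Not a real TDD frame structure.

def schedule(frames, sniff_every=10):
    """Return per-frame roles: 'serve' normally, 'sniff' every Nth frame."""
    return ["sniff" if i % sniff_every == 0 else "serve" for i in range(frames)]

plan = schedule(20)
print(plan.count("sniff"), plan.count("serve"))  # 2 of 20 frames lost to sniffing
```

Tuning `sniff_every` is the trade-off: sniff more often and the station tracks its neighbors better but serves less traffic.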

The disadvantages are not as harsh as they may seem. So long as the SON is part of a HetNet, the cost can be kept to a minimum. Here is how: the HetNet is a web of base stations and wireless access points, up to and including macro stations, small cells (the preferred element of SON) and Wi-Fi. The largest cellular carriers use HetNets in large metropolitan areas (think New York, Los Angeles, Chicago, etc.) because, at that level of user saturation, macro base stations alone are grossly ineffective.

However, the needs of the many mean all markets, even smaller, rural ones, will eventually need a HetNet. All of these intricacies thus become critical for carriers in all wireless markets and mobile networks.

This is where self organizing networks are so important. SON is one technology that will meld the small cells and macrocells while providing a superior experience for the carriers and their customers. Expect SON to evolve dramatically in the coming years. Some major U.S. carriers have plans to expand from 100,000 active sites to over 500,000. This massive growth will require SON within the HetNet.

Initial upfront costs are a concern for some smaller carriers, but the long-term savings on technicians more than offsets the initial investment.  It should go without saying the profit margins will take a dip on the front end but will rapidly recover as self organizing networks saturation increases.  Success will depend on all the previous factors and full implementation with proper logistical planning.

Based on this, what is your opinion?  Is the potential upfront cost and dip in profits advantageous in the overall scope of the business or is the on-call skilled technician a safer and more dependable alternative?  Certain factors certainly must be considered on both, but exactly what are those factors outside of forces of nature?  Feel free to provide your personal thoughts on this.


Zero Rating for Broadband and Mobile Operators

A report on zero rating, issued by the Federal Communications Commission just a week and a half before the inauguration of Donald Trump, said that zero rating by broadband and mobile network operators violates net neutrality rules. “Zero-rated” applications do not count toward the data caps or usage allowances imposed by internet service providers. Forbes staff writer Parmy Olson called the report “too little too late”.
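In billing terms, zero rating is simply an exclusion rule in the usage meter, as this sketch shows (the app names and volumes are placeholders, not any carrier's actual plan):

```python
# Simplified sketch of how zero rating changes data-cap accounting.
# App names, volumes, and the cap are placeholders, not a real carrier plan.

def metered_usage_mb(sessions, zero_rated_apps):
    """Sum usage toward the cap, skipping traffic from zero-rated apps."""
    return sum(mb for app, mb in sessions if app not in zero_rated_apps)

sessions = [("video_service", 1200), ("web", 300), ("email", 50)]

# Without zero rating, all 1,550 MB count toward the cap...
print(metered_usage_mb(sessions, zero_rated_apps=set()))
# ...with the carrier's own video app zero-rated, only 350 MB count.
print(metered_usage_mb(sessions, zero_rated_apps={"video_service"}))
```

The neutrality concern follows directly: the carrier chooses which entries land in `zero_rated_apps`, and that choice advantages some services over others.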

Zero rating has come under fire from many quarters. “While network capacity could become a problem if zero-rated offerings truly take off,” writes Colin Gibbs in a review of 2016 for Fierce Wireless, “the biggest challenge to the model has been claims that it’s a threat to net neutrality rules.” Last year, Verizon began offering zero-rated video streaming through the NFL Mobile app.


Keeping the Net Neutral


The idea of net neutrality is that everything on the internet should be treated openly and fairly. Net neutrality prohibits blocking of sites by ISPs. It prohibits throttling: ISPs should not slow down or speed up content for different services. It calls for increased transparency and prohibits paid prioritization of traffic. Before the recent FCC report, sponsored data plans (plans with zero rating) were to be judged by the agency on a case-by-case basis. NextGen’s wireless practice has 22+ years of experience working through these kinds of telecom market movements and standards.


Zero Rating for Broadband and Mobile Network Services


Facebook offers free internet access to underdeveloped countries with curated content. According to Internet.org, “Free Basics by Facebook provides people with access to basic websites for free – like news, job postings, health and education information, and communication tools like Facebook.” The motto of the service is “Connecting the World”.

A number of mobile network providers have taken up the practice. The first to try zero rating was T-Mobile with its Music Freedom offering in 2014, followed by a video service called Binge On. Verizon came up with its own mobile video service called Go90. Perhaps the most aggressive has been AT&T’s partnership with DirecTV. Virgin Mobile’s 4G plans now allow free zero-rated data use on Twitter.


Presenting the case against zero rating by broadband and mobile network operators, Mike Egan stated articulately in a YouTube video: “Zero rating isn’t about giving online services or online creators a chance. It’s about mobile carriers finding a loophole so that they can keep you even more locked into what easily becomes their new media ecosystem.”

He says that “certain services are privileged over others” and that it is one of the best ways to “kill a free and open internet”.

Egan and others like him are upset, and he talks in terms of “the oppressor” versus “the oppressed”.   The Federalist Society takes a different view. In their YouTube video about zero rating, they compare it to getting free samples of ice cream. “This is a way to increase the adoption of the internet,” the spokeswoman says. “All that zero rating is doing is helping to increase the competition and expanding the user choice.”


The Less Regulated Road Ahead


The “too little too late” remark of the Forbes staffer is all about the new political realities in America. Despite the recent pronouncement against zero rating by the FCC, chances are the practice will continue unabated. President Trump has vowed to cut government regulations by 75%, and the new FCC chairman Ajit Pai will likely tamp down any opposition to zero rating by ISPs and mobile network operators.

A blog post from CCS Insight says, “Mr. Pai had opposed government intervention in the telecommunications market and has been an open critic of an FCC report disapproving of zero-rating data, also known as toll-free data….” The blogger goes on to say that there will certainly be a rise in the number of toll-free data offers.


Conclusion on Zero Rating for Broadband and Mobile Services


Many are concerned about the potential loss of internet freedom with zero rating. As Egan put it, “It’s a war for the future of our media landscape.” How that war plays out when deregulation sets in remains to be seen. Neutrality is a hard thing to maintain. What are your ideas on zero rating? Does your network provider bundle any of these services? How do you think it will affect the future of the internet? Please add your comments below.