Africa’s Big Data Future – Innovation Needed To Lower Internet Access Costs, Says Russell Southwood

Africa’s data and Internet communications infrastructure has improved so much in the last decade that it’s easy to become complacent. The dual challenges of price and quality of service have not been overcome. African regulators have never been good at imagining the future and with all the improvements they seem to have taken their eye off the ball. Russell Southwood looks at why things are stuck and what might get them moving again.
Africa’s transition to data is crucial for the next round of investment in the continent. The existence of relatively cheap Internet access and the services and content it brings with it are needed to power a second wave of economic growth.

The challenge of all challenges is that the operators in Africa's Internet market need to be able to deliver cheaper data access than elsewhere, because most Africans do not earn US or European salaries. Getting Internet access prices to US or European levels is not enough; they have to go lower.
Sub-Saharan Africa has countries that are amongst the most expensive places to operate in. So whoever Africa’s operators are or will be, they have to become pioneers in lowering the costs of both building and operating data infrastructure. Government and regulators need to understand the scale of this challenge and help operators become cost-cutting pioneers.


Need: An Effective, Large-Capacity Data Network

African regulators are not currently in a good place to make this happen. In the main, mobile operators have taken over as the market incumbents and are no longer forcing the pace of change, instead largely reacting to what's happening elsewhere. The need for an effective, large-capacity data network seems to have caught many of them off-balance.
African regulators who have pursued the opening of African telecoms markets have been slow to react to changed market circumstances. In many cases, even what were once quite competitive markets are now stuck. A brief summary of some of the difficulties may be helpful:

* Dominant Players: MTN, Sonatel, Safaricom


A number of Africa’s telecoms markets are now dominated by a single player. Sometimes efforts have been made to declare them dominant players in regulatory terms but these have largely been ineffective. These dominant players have operated skillfully as price progressives (lowering tariffs) and in so doing have cemented their market position.
In Kenya, Safaricom has such an unassailable and central position that whatever anybody does, Safaricom usually ends up benefitting. If you need a telecoms or network partner, why bother with those who hold less than 30% of the market? The recent Kenya Power partnership deal on Fibre-To-The-Home will reinforce its position in yet another market niche.
In Senegal, Sonatel is testing free Wi-Fi in Rufisque. Free Wi-Fi is undoubtedly a good thing, but who's paying? And this in a country where there are no independent Internet service providers, no alternative fibre providers, and where its two mobile competitors struggle to make the contest anything like a fair, competitive fight.
And then there's MTN in Nigeria, which for all its recent troubles occupies the commanding heights of the country's telecoms and data markets. Others could be added to the list.

* Old School Stuck: Decaying State-Owned Telcos

Some of Africa's regulators have simply not got off the blocks in terms of creating a competitive market. In these countries the dominant player is usually a decaying state-owned telco with about as much appetite for innovation as a sleeping dog. The governments in these countries have chosen to preserve an inefficient job-creation scheme over offering cheaper Internet and a more efficient economy.
The country that heads this category must surely be Ethiopia, where the absence of any competition means the market is probably about a third smaller than it might otherwise be. The State lacks the capital to make these financially leaky dinosaurs effective. But in this long list of countries we must also include places like Djibouti, Togo and Cameroon.
Take Cameroon, where the Government has ensured that it owns all the landing stations and that its telco Camtel has a de facto monopoly over wholesale bandwidth. And all of these dilapidated incumbents, without strategy, innovation or ideas, want to be mobile operators.

* Last Man Standing Theory: “Consolidation”

More conventional industry analysts are keen on seeing “consolidation” as one answer to current problems. To be fair to their argument, Africa has more operators than many other places globally.
But what does consolidation mean? There will be fewer operators (two per country?) and their market power will be even greater. The last man standing theory is that if you are one of the lucky surviving operators, you will then be able to hike your prices back to the level they were at when the markets first opened.
With Airtel selling off some of its smaller opcos to Orange, the full scale of this is not immediately apparent. But what if this was really just the start of Bharti Airtel’s long goodbye to Africa? Consolidation will inevitably happen but then how do regulators ensure that markets maintain a competitive dynamic?

* No Technical or Business Model Innovation


Many African regulators were admirably quick to allow TV White Spaces pilots. All of these worked at a technical level and offered distinct advantages to independent operators wanting to deliver Internet access more widely. But with one exception, not a single African regulator has licensed an operator to use TV White Spaces.
So whilst Africa has a massively improved network infrastructure, many of the same market blockages remain. What follows is a list of things African regulators might do to achieve two things: firstly, to help existing operators improve their cost base; and secondly, to encourage new market players to invest in business models that will offer cheaper operating costs:

Open up competition in the wholesale space by licensing alternative fibre providers
There are still African countries where public utilities (railway, water, oil and electricity companies) hold fibre along their networks. In countries with state incumbents, competition and price levels in the wholesale market would be greatly improved by these entities offering their surplus fibre capacity.
In addition, there are independent wholesale fibre providers (Liquid Telecom, Phase3, FibreCo) operating fibre networks that compete with existing providers. These kinds of third-party fibre providers can be encouraged by offering them licences to invest.
Take a leaf out of the Project Link book and open up licences for metronet providers
Google's Project Link has demonstrated that it's possible to create a metronet fibre provider that can both offer better prices and invest in metronet fibre that is more open access. Regulators need to learn lessons from its experiences in Uganda and Ghana and look at how they might encourage this kind of entity in their own countries.
Encourage the sale of dark fibre by all market players to make maximum use of fibre assets available
In many countries, there is no shortage of fibre but prices remain unrealistically high. Regulators need to insist that all operators with wholesale fibre make a dark fibre offer. Where there is only one operator, prices will need to be controlled by the regulator. These dark fibre prices can be benchmarked against the more competitive markets across the continent.
Make it easier and quicker to build fibre infrastructure
African regulators and Governments need to ensure that the sale and administration of rights of way are dealt with by a single agency and that local administrations understand that getting fibre networks built is of national strategic importance, not another source of tax income.
Existing and planned ducts should all be available on a shared basis so that several operators can make use of a single physical infrastructure element. Building fibre ducts should be a mandated part of all major road or rail projects.
Taxes – Stop taxing Internet access devices (smartphones, tablets and laptops)
On the tax front, some African countries have removed import taxes from handsets. All countries should do this and include tablets and laptops in the same exemption. As the cheaper smartphones now in the market have shown, getting the cost of devices down does increase the number of devices sold.
Licence power distributors who will supply power to any base station operator to lower operating costs
Companies should be licensed to distribute electricity to anyone operating a base station or tower. Facilitating reliable electrical power to base stations will improve quality of service levels and lower energy costs for operators.
Encourage innovation in both technologies and business models
Regulators and governments need to actively encourage innovation that will lower operating costs and, in so doing, lower the final price of Internet access to users. This encouragement needs to focus on two areas: firstly, the roll-out of innovative technologies that will help lower capital and operating costs; and secondly, the licensing of companies or organizations offering new business models that will do the same.

White Spaces and Millimetre Bands

In technology terms, there is everything from near technologies (already operating) like TV White Spaces and millimetre bands to future technologies like lasers and drones. A few weeks ago Facebook announced its OpenCellular project, the equivalent of a network in a box. African regulators should be running pilots of technologies like this, with the clearly announced intention of licensing operators to use them at the end of the pilot period.
In terms of new business models, regulators should be encouraging any of a growing list of companies (including Argon, Mawingu, Vanu and Virural) to start rolling out to uncovered rural areas. Some of these new licensees could work directly with mobile operators, whereas others could work independently. But each of these categories of licensees would provide coverage (voice and data) in areas currently uncovered. Closer to the core, regulators need to find operators who will invest in the latest VoLTE technology or other IP-based voice technologies.
African regulators and Governments need to start seeking out those who can deliver innovations of these kinds and put out the welcome mat for them. If they do so, Africa may yet become a pioneering continent in terms of cheaper Internet access.

6 Myths About Big Data, Debunked

Summary                                                                             

Is your company still struggling to understand what big data is and how to manage it? Here are six myths about big data, straight from the experts, to help you separate truth from fiction.


Definition:

Big data is a term for data sets that are so large or complex that traditional data processing applications are inadequate. Challenges include analysis, capture, data curation, search, sharing, storage, transfer, visualization, querying, updating and information privacy.

Advances in cloud computing, data processing speeds, and the huge amount of data input from sources like IoT mean that companies are now collecting previously unseen amounts of data. Big data is now bigger than ever. But organizing, processing, and understanding the data is still a major challenge for many organizations.

The Myths

1. Big data means 'a lot' of data

2. The data needs to be clean

3. Wait to make your data perfect

4. The data lake

5. Analyzing data is expensive

6. Machine algorithms will replace human analysts

1. Big data means 'a lot' of data

Big data is a buzzword these days, but what it really means is still often unclear. Some people refer to big data as simply a large amount of data, but that's not quite correct. It's a little more complex than that. Big data refers to how data sets, either structured (like Excel sheets) or unstructured (like metadata from email), combine with data such as social media analytics or IoT data to form a bigger story. The big data story shows trends about what is happening within an organization: a story that is difficult to capture with traditional analytic techniques.
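
To make the "bigger story" idea concrete, here is a minimal sketch in Python (with pandas) that joins a structured data set with figures derived from an unstructured source. The column names and numbers are invented for illustration; they are not from any real system.

```python
# A minimal sketch of the "bigger story" idea: joining a structured
# data set (weekly sales) with unstructured-derived data (social media
# mention counts) to surface a trend neither shows alone.
# All figures here are made up for illustration.
import pandas as pd

# Structured data, e.g. exported from a spreadsheet or ERP system.
sales = pd.DataFrame({
    "week": ["2016-W01", "2016-W02", "2016-W03", "2016-W04"],
    "units_sold": [120, 115, 180, 240],
})

# Data derived from unstructured sources, e.g. counts of brand
# mentions pulled from a social media analytics tool.
mentions = pd.DataFrame({
    "week": ["2016-W01", "2016-W02", "2016-W03", "2016-W04"],
    "brand_mentions": [300, 280, 900, 1400],
})

# Combine the two on the shared key and look at them side by side.
story = sales.merge(mentions, on="week")
story["mentions_per_unit"] = story["brand_mentions"] / story["units_sold"]
print(story)
# A spike in mentions that runs ahead of a spike in sales is exactly
# the kind of cross-source trend a single-table analysis misses.
```

The point is the join itself: neither table alone shows that mentions surged ahead of sales.
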
Jim Adler, head of data at Toyota Research Institute, also makes a good point: data has mass. "It's like water: When it's in a glass, it's very manageable. But when it's in a flood, it's overwhelming," he said. "Data analysis systems that work on a single machine's worth of data will be washed away when data scales grow 100 or 1,000 times. So, sure, prototype in the small, but architect for the large."


2. The data needs to be clean

"The biggest myth is you have to have clean data to do analysis," said Arijit Sengupta, CEO of BeyondCore. "Nobody has clean data. This whole crazy idea that I have to clean it to analyze doesn't work. What you do is, you do a 'good enough' analysis. You take your data, despite all the dirtiness, and you analyze it. This shows where you have data quality problems. I can show you some patterns that are perfectly fine despite the data quality problems. Now, you can do focused data quality work to just improve the data to get a slightly better insight."
Megan Beauchemin, director of business intelligence and analytics for InOutsource, agreed. "Oftentimes, organizations will put these efforts on the back burner because their data is not clean. This is not necessary. Deploying an analytic application will illuminate, visually, areas of weakness in data," she said. "Once these shortfalls have been identified, a cleanup plan can be put into place. The analytic application can then utilize a mechanism to highlight clean-up efforts and monitor progress."
"If your data is not clean, I think that is all the more reason to jump in," Beauchermin said. "Once you tie that data together, and you're bringing it to life visually in an application where you're seeing those associations and you're seeing the data come together, you're going to very quickly see shortfalls in your data." Then, she said, you can see where the data issues lie, offering a benchmark as you clean the data up.

3. Wait to make your data perfect


Here's another reason you shouldn't wait to clean up your data: "By the time you've cleaned your data, it's three months old—so you have stale data," said Sengupta. So, the information is no longer relevant.
Sengupta spoke about a conference where Josh Bartman, from First Interstate Bank, brought up an important point. "Josh showed how he was running an analysis, finding a problem, changing the analysis, rerunning the analysis. He said, 'Look, my analyses are only about four to five minutes apart. So, if I can run an analysis, find the problem, fix the problem, rerun the analysis and see the report four or five minutes later, that changes the nature of how I approach analysis.'"
Sengupta compared it to the old way of coding. "I get everything right, then I code. But now, everybody does agile coding," he said. "You write something, you test it, you see how you can make it better, then you make it better. The world has changed and people are still acting like it's the old way of doing things."

4. The data lake


Data lakes, which are, loosely, storage repositories holding large amounts of raw structured and unstructured data, are frequently referred to in the context of big data.
The only problem is, despite how often they are cited, they don't exist, Adler said. "An organization's data isn't dumped into a data lake. It is carefully curated in a departmental 'data silo' that encourages focused expertise. They also provide the accountability and transparency needed for good data governance and compliance."

5. Analyzing data is expensive


Are you afraid to get started because of the presumed expense of data analysis tools? There's good news for you: with the free data tools available today, anybody can get started with analyzing big data.
Also, according to Sengupta, the low cost of today's cloud computing means "you can actually do things that were never possible."

6. Machine algorithms will replace human analysts

Sengupta sees an interesting dichotomy in terms of approaches to analyzing big data. "There's a split, where on one side there are people who are saying, 'I'm going to throw thousands of data scientists at that problem.' Then, there are people who are saying, 'Machine learning is going to do it all. It's going to be completely automated,'" he said.
But, Sengupta doesn't think either of those solutions works. "There aren't enough data scientists, and the cost is going up fast," he said. "Also, business users have years of domain knowledge and intuition about their business. When you bring a data scientist in and say, 'That guy's going to do it and tell you what to do,' that actually creates the exact wrong kind of friction which prevents adoption of those insights. Data scientists often can't learn enough about your business to be really smart about the business immediately."
The "perfect" data scientist, who understands exactly how a specific business works, how its data works, is a myth, said Sengupta. "That person doesn't exist."
In reality, Sengupta said, "most data science projects actually don't get implemented because it's so hard. It takes months to get done and, by the time it's done, the question you care about is already too old."

But, there are also problems with relying too heavily on machine learning. "It's giving me an answer but not an explanation. It's telling me what to do, but not why I should be doing it," he said. "People don't like being told what to do, especially by magical machines."
The key, he said, is not just the answers—it's the explanations and recommendations.
On one hand, he said, data scientists will become more and more specialized in the really hard problems. "Think of the time when every company started a data processing department and a number processing department. Fortune 500 companies had 'Data Processing Departments' and 'Number Processing Departments.' They basically became Excel, Word, and PowerPoint."
Still, there are people who remain experts in data and number processing.
"If I go to Morgan Stanley, believe me, there are still people who are experts in data processing and number processing. They still exist. They have different titles and different jobs but, in really advanced cases, those people still exist. But 80-90% will have moved to Excel, Word, and PowerPoint. That's how the world, in terms of big data, should evolve."
Courtesy: TechRepublic

Photo Of Eric Gethi Kanyingi, Force Number 97861, Mastermind Of The Planned Attack On Ruiru GSU Recce Company Headquarters, Kenya

Eric Gethi Kanyingi, 26, Force Number 97861, GSU Recce Squad Kenya, Went AWOL 2014


Summary

Eric Gethi Kanyingi, a 26-year-old ex-officer of Kenya's elite paramilitary GSU Recce unit and a highly trained hostage rescue officer, was reported to have been planning a terror attack on the GSU Recce headquarters in Ruiru, Kiambu County, among other targets. He joined the unit in 2013 as Force Number 97861 and went AWOL after two years in the service.


Githurai 45, Nairobi, Kenya - Thursday, 4th August 2016 - On the morning of Thursday, August 4, the Kenyan police intercepted weapons belonging to the ex-Recce squad officer in a raid in Githurai 45.

Map Of Githurai 45, Where Ex-GSU Recce Officer Eric Gethi Kanyingi Lived

In the wee hours of Thursday, August 4, the Kenyan police intercepted three AK-47 rifles and 178 bullets. They belonged to the 26-year-old former Recce squad officer, who is suspected to have been radicalised by the militant Islamist group al-Shabaab.

The police were responding to a tip-off that the officer was planning an attack on, among other targets, the GSU Recce headquarters in Ruiru, near Nairobi.

The ex-cop, said to be a highly trained hostage rescue officer, managed to escape during the dawn raid, leaving behind his weapons.

Weapons Cache

  • 3 AK-47 Rifles
  • 178 Bullets

Weapons Cache Recovered From Eric Gethi Kanyingi's House in Githurai 45


The weapons, which had previously been sewn inside a mattress, TUKO.co.ke has learnt, were recovered in a sack dumped away from the house, near the Ruiru River.

The suspect reportedly dumped the weapons after learning that detectives were on his trail.
The suspect was identified in police documents as police constable Eric Gethi Kanyingi, Force Number 97861, who went AWOL from the Recce squad in 2015.

Detectives from the Ruiru police station raided his Githurai 45 house at dawn on Thursday, August 4, where they recovered literature suggesting that the former officer had been radicalised.

Westgate Terror Attack

A Member of Kenya's Elite Paramilitary Units Escorts Hostages Out Of Nairobi's Westgate Mall

Among the materials he was reading were newspaper excerpts about the Westgate terror attack, which claimed 67 lives, an attack for which al-Shabaab claimed responsibility.

The officers who conducted the raid told TUKO.co.ke that the former officer had previously been sighted at the Riyadh Mosque, which has been in the limelight over allegations of radicalisation.

George Kinoti, the police spokesman, has said that officers have been deployed to hunt down Kanyingi, who is a well-known criminal.

The raid came in the wake of police reports that key officers in the Kenyan police force had been radicalised and were working in cahoots with criminals.

Last month, a radicalised officer opened fire on his colleagues at the Kapenguria Police Station, killing seven people in the eight-hour attack.

Courtesy of Tuko


Dawn Of The Planet Of The Apes (2014): The Third (And Hopefully Final) Instalment Of The Planet Of The Apes Trilogy.

The Planet of the Apes franchise has released its latest offering, Dawn of the Planet of the Apes. The film premiered on Thursday, 10th July, and so far it promises to be a resounding success, having generated sales in the region of $4.1 million, according to the Internet Movie Database, IMDb. The movie opened in 2,750 theatres in North America on Thursday night.

Dawn of the Planet of the Apes (2014) follows in the tradition of the two earlier instalments, Rise of the Planet of the Apes (2011) and Planet of the Apes (2001). For those of you not in the know, the series is a science fiction adventure that follows the antics of Caesar the ape.

Caesar, the Hyper-Intelligent Ape

Caesar the ape was born hyper-intelligent after his mother received the Gen-Sys drug ALZ-112, designed to cure neurological disorders such as Alzheimer's disease. At the risk of being a major spoiler, you really need to watch the two preceding movies.

Caesar undergoes harassment at the hands of the keepers of a primate shelter after being removed from the home of Dr Will Rodman, who raised him.

Having acquired bipedalism and human speech, Caesar leads a revolt at the shelter, releasing all the other primates held there. The plot of Dawn picks up from the previous instalment, Rise of the Planet of the Apes (2011), where Caesar has now morphed into a fully fledged hyper-intelligent being, bent on avenging the misdeeds meted out by humans on him, his mother and the other apes in captivity.

Dawn of the Planet of the Apes: 2014 Summer Blockbuster

Dawn of the Planet of the Apes is proving to be a real summer blockbuster, going by the rave reviews it has been receiving on social media barely 24 hours after it premiered on the big screen in the USA. The movie is big on 3D animation, as evidenced by the large-screen 3D formats in which the picture has been released.


I have watched the movie and I must say it is not as good as the first instalment, Planet of the Apes, though the SFX are mind-boggling. I nonetheless strongly recommend this movie, as it promises to be the biggest release of the summer of 2014.

Mdaku.

SEO Article Writing’s Impact on Web Page Ranking


Search Engine Optimization (SEO) is the use of methods, techniques and processes aimed at improving the visibility of a website to both its human audience and the search engine robots that crawl and index web pages, ranking them with respect to their structure and content. SEO uses natural, or organic, means to achieve high rankings in the search engine results pages (SERPs) served up by a search engine, with SEO article writing topping the list of techniques; it stands in contrast to paid approaches such as Search Engine Marketing (SEM).

SEO article writing is therefore, by extension, the craft of writing original, informative, captivating, easy-to-read content for your readers, using keywords in the right quantity to ensure that your content is amenable to both your human readers and the search engine crawlers. In this respect, keywords are the words or phrases that aptly and concisely describe what a particular web page is about. It is standard practice, for instance, to include no more than 10 keywords in a 500-word article (a keyword density of about 2%) and to insert them delicately enough that the reader does not feel overwhelmed by the repetition.
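
As a rough sketch of that rule of thumb, the following Python function counts how often a keyword phrase appears in a text and reports its share of the total word count; anything far above the roughly 2% implied by 10 keywords per 500 words suggests the copy may read as keyword-stuffed. The sample text, keyword and threshold are illustrative assumptions, not fixed SEO rules.

```python
# A minimal keyword-density check, assuming the "no more than
# 10 keywords per 500 words" rule of thumb (roughly 2% density).
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword phrase's share of total words, as a percentage."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    phrase = keyword.lower().split()
    n = len(phrase)
    # Count non-overlapping-in-spirit occurrences of the phrase.
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase)
    return 100.0 * hits * n / len(words)

# Made-up sample copy and target keyword.
article = ("SEO article writing helps pages rank. Good SEO article "
           "writing balances readers and crawlers.")
density = keyword_density(article, "SEO article writing")
print(f"density: {density:.1f}%")
if density > 2.0:  # illustrative threshold, not a fixed rule
    print("Consider trimming keyword repetitions.")
```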

SEO Article Writing: A Cheap Way to Top the Rankings

Since the key purpose of SEO article writing is to ensure that your content gets maximum visibility in the search engine results pages (SERPs), meaning that the largest possible number of people interact with your content, you need a few tools to measure how well it is bringing visitors to your website. A number of such tools are available, both free and for a fee. You can find them by typing the phrase "keyword optimization" into your search engine's search bar; the results will include online tools, such as keyword density optimizers, that suggest keywords to use for a given text.

These tools range from the simple, those that will suggest keywords for you, to the complex, those that will tell you which keywords people actually use to reach your website, that is, your keyword referrals. From these you can tailor your content, using good SEO article writing practices, to satisfy the needs of the people keying in those keywords. You can ensure that all your content matches up with the specific keywords that are "hot" for your particular subject.


If you are a do-it-yourself, DIY, buff, you can go a step further and use Google's advanced tools to find out exactly what keywords people are typing in at any given moment, tailoring your content to match those terms. SEO article writing, a subset of organic SEO, can be a cheap and effective way of ensuring that your content is right at the top of all major search engines, with minimal expense on your part.