The cost and complexity of Hadoop keep people from achieving anything with it, according to big data experts

Hadoop, the open source big data framework first developed at Yahoo for analyzing large data sets, is a total failure that costs too much and is too much of a headache to implement, say people in the field.

In a lengthy and in-depth piece on Datanami, big data experts describe Hadoop as too primitive for complex processing work or interactive, user-facing applications. At best, it’s a batch-processing system, which is how Hadoop started out, and it doesn’t seem to have grown much beyond that.

“I can’t find a happy Hadoop customer. It’s sort of as simple as that,” Bob Muglia, CEO of Snowflake Computing, told Datanami. Snowflake develops and runs a cloud-based relational data warehouse product. “It’s very clear to me, technologically, that it’s not the technology base the world will be built on going forward.”

Hadoop isn’t going away overnight

Hadoop is pretty widely used, so it won’t go away overnight, but it won’t see any new deployments, either. Muglia says newer approaches, such as Amazon S3 for storage and Apache Spark for real-time, in-memory processing, will relegate Hadoop to niche and legacy status going forward (a sketch of that combination appears at the end of this article).

“The number of customers who have actually successfully tamed Hadoop is probably less than 20, and it might be less than 10,” Muglia told Datanami. “That’s just nuts given how long that product, that technology has been in the market and how much general industry energy has gone into it.”

The article points out that Facebook was one of the bigger Hadoop deployments, but the company no longer uses it because “there’s a bunch of things that people have been trying to do with it for a long time that it’s just not well suited for,” according to an ex-Facebook staffer.

Muglia says the problem is that the community around Hadoop is small and never made much of an effort to grow the product. Meanwhile, alternatives have emerged, not the least of which is the public cloud; we’ve even seen the rise of big data as a service.

So the Hadoop team has to make a decision. Does it do the heavy lifting required to bring the software up to speed, making it easier to use and more widely usable (the two big knocks in the Datanami piece), or does it just maintain what it has and leave the work to newer technologies?
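To make that S3-plus-Spark combination concrete, here is a minimal sketch of the kind of workload being described: an ad hoc aggregation run with PySpark directly against data sitting in S3, with no HDFS cluster to provision and no MapReduce batch job to schedule. This is an illustrative assumption rather than anything from the Datanami piece; the bucket path s3a://example-bucket/events/ and the "timestamp" field are hypothetical, and reading via s3a:// assumes the Hadoop S3A connector (hadoop-aws) is on Spark's classpath and AWS credentials are configured.

# Minimal PySpark sketch: interactive aggregation over data in S3.
# Assumes pyspark is installed, the hadoop-aws (S3A) connector is
# available, and AWS credentials are configured. The bucket path and
# the "timestamp" field are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-spark-sketch").getOrCreate()

# Read JSON events straight from S3 object storage; storage and compute
# are decoupled, unlike an HDFS-backed Hadoop cluster.
events = spark.read.json("s3a://example-bucket/events/")

# Aggregate in memory and get an answer interactively, rather than
# submitting a MapReduce batch job and waiting for it to finish.
daily_counts = (
    events.groupBy(F.to_date("timestamp").alias("day"))
          .count()
          .orderBy("day")
)

daily_counts.show()

The design point is the decoupling: S3 holds the data and scales independently of compute, while Spark supplies in-memory processing on demand, which is the split Muglia argues will push HDFS-and-MapReduce-style Hadoop into legacy status.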