Unlock the potential of #Hadoop for large-scale data processing. Niklas Lang's comprehensive guide covers Hadoop's architecture, installation in different environments, and essential commands.
Any Hadoop experts out there looking for some consulting? Got a Hadoop cluster that needs some expert TLC.
This is a customer-facing role, so if that's not your thing, keep scrolling.
TLDR: If you know Hadoop and live close enough to Belfast to commute, you should apply.
I've posted this before, but it's been a little while #fedihire. Also, I'm adding some additional information this time. This is my team. We are already on three continents and six time zones, but #Belfast is a new location for the team. I know literally nothing about the office.
I know that at a lot of places Hadoop is the past, and sure, we see a ton of #Spark (I do not understand why that is not listed in the job description, but maybe because they want to emphasize that we need Hadoop expertise?). You can see all the projects we support at https://www.openlogic.com/supported-technology
It depends on how you count, as I was on two teams during the transition, but I've been on this team for over 5 years now. It's a great team. I've been with the company now right at 7 years. I cannot say how we compare to Belfast employers, but this is well more than double the time I have stayed at any other employer (even if you count UNC-CH as a single employer rather than the different departments, I've beaten them by well over a year at this point).
My manager has been on this team for almost 15 years. His manager has been with this team for almost as long as me, but with the company much longer. His manager has been here almost as long as me (I actually did orientation with him). His manager is a her and she's been here almost as long as me. So, obviously, this is a place where people want to stay!
Our team has a lot of testosterone, but when I started, our CEO was a woman. The GM for the division is a woman.
My manager is black. The manager of our sister team is black.
I think you'll find our team and company is concerned about your work product and not how you dress, what bathroom you use, or the color of your skin.
If you take a look at our careers page, you'll see this:
Work Should Be Fun
There’s always something to look forward to as a Perforce employee: scavenger hunts, community lunches, summer events, virtual games, and year-end celebrations just to name a few.
We take that shit seriously. Nauseatingly so sometimes, lol.
Actually, we take everything on the careers page seriously, but I know from experience that some places treat support like they are a shoe sole to be worn down. Not so here. It's not all rainbows and sunshine, of course. The whole point is that the customer is having an issue! Our customers treat us with respect because management demands that they do.
------
The Director of Product Development at Perforce is searching for an Enterprise Architect (#BigData Solutions) to join the team. We are looking for an individual who loves data solutions, views technology as a lifestyle, and has a passion for open source software. In this position, you'll get hands-on experience building, configuring, deploying, and troubleshooting our big data solutions, and you'll contribute to our most strategic product offerings.
At OpenLogic we do #opensource right, and our people make it happen. We provide the technical expertise required for maintaining healthy implementations of hundreds of integrated open source software packages. If your skills meet any of the specs below, now is the time to apply to be a part of our passionate team.
Responsibilities:
Troubleshoot and conduct root cause analysis on enterprise-scale big data systems operated by third-party clients, assisting them in resolving complex issues in mission-critical environments.
Install, configure, validate, and monitor a bundle of open source packages that deliver a cohesive world class big data solution.
Evaluate existing Big Data systems operated by third-party clients and identify areas for improvement.
Administer automation for provisioning and updating our big data distribution.
Requirements:
Demonstrable proficiency in #Linux command-line essentials
Strong #SQL and #NoSQL background required
Demonstrable experience designing or testing disaster recovery plans, including backup and recovery
Must have a firm understanding of the #Hadoop ecosystem, including the various open source packages that contribute to a broader solution, as well as an appreciation for the turmoil and turf wars among vendors in the space
Must understand the unique use cases and requirements for platform-specific deployments, including on-premises vs. cloud vs. hybrid, as well as bare metal vs. virtualization
Demonstrable experience in one or more cloud-based technologies (AWS or Azure preferred)
Experience with #virtualization and #containerization at scale
Experience creating architectural blueprints and best practices for Hadoop implementations
Some programming experience required
#Database administration experience very desirable
Experience working in enterprise/carrier production environments
Understanding of #DevOps and automation concepts
#Ansible playbook development very desirable
Experience with #Git-based version control
Be flexible and willing to support occasional after-hours and weekend work
Experience working with a geographically dispersed virtual team
https://jobs.lever.co/perforce/479dfdd6-6e76-4651-9ddb-c4b652ab7b74
come work on my team! #fedihire
Apply at https://jobs.lever.co/perforce/479dfdd6-6e76-4651-9ddb-c4b652ab7b74
«...as of December 2019, it is among the top five contributors to #OpenJDK, alongside Oracle, Red Hat, SAP, and Google.
The company releases and maintains #Liberica JDK, a Java distribution based on #OpenJDK for an extended set of platforms, including containerized builds with #AlpineLinux. BellSoft holds a TCK license, so Liberica JDK is guaranteed to conform to the #JavaSE standard.
At the same time, the company develops #LiberCat, a Russian #JavaEE application server based on the open source #ApacheTomcat project.
The company's work also involves developing and supporting other complex open source products, such as OpenJDK, the #gcc and #LLVM compilers, and the #Hadoop big data platform...»
HIRING: Lead Software Engineer - AI / New York City USD 195K+
In today’s #data-driven world, having #inhouse, #outsourced, or #dedicated data specialists is crucial for making informed business decisions.
Whether you need an in-house team or an outsourced data engineering service, skilled professionals ensure that your business can collect, process, and extract valuable insights from data, enabling better decision-making, operational efficiency, and competitive advantage.
Info: https://www.ibm.com/think/topics/data-engineering
What is Big Data & Hadoop Technology? Uncover the essentials of Big Data and explore how Hadoop revolutionizes data storage and processing at scale.
#BigData #Hadoop #DataScience #AI #Innovation
READ MORE HERE: https://buff.ly/3UbQLhZ
Hadoop Core Components: HDFS, YARN & MapReduce: Learn about the fundamental building blocks of Hadoop and how they work together to process big data efficiently.
#BigData #Hadoop #DataScience #Innovation #AI
READ MORE HERE: https://buff.ly/3C1A96o
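The post above covers how HDFS, YARN, and MapReduce work together. As a minimal sketch of the MapReduce half, here is a word-count example in the style of Hadoop Streaming, where the mapper and reducer are plain functions over lines of text; the function names and sample data here are my own illustration, not taken from the linked article.

```python
from itertools import groupby

def mapper(lines):
    # Map phase: emit a (word, 1) pair for every word in the input.
    for line in lines:
        for word in line.strip().lower().split():
            yield (word, 1)

def reducer(pairs):
    # Reduce phase: in real Hadoop the shuffle/sort delivers pairs grouped
    # by key; we simulate that with sorted() + groupby, then sum per word.
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield (word, sum(count for _, count in group))

if __name__ == "__main__":
    sample = ["the quick brown fox", "the lazy dog"]
    counts = dict(reducer(mapper(sample)))
    print(counts["the"])  # -> 2
```

In a real cluster, the mapper and reducer would run as separate processes on different nodes, with HDFS holding the input splits and YARN scheduling the containers; the shuffle between them is what this toy version replaces with a single in-memory sort.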
Hadoop & NoSQL: HBase, Cassandra & MongoDB: Explore how these technologies revolutionize data storage and processing for large-scale applications.
#BigData #NoSQL #Hadoop #DataScience #Innovation #AI
READ MORE HERE: https://buff.ly/40epsYw
@theluddite I once had a client ask us to "install #Hadoop" for them so they could have a place to store a dozen #Excel files. Thankfully we convinced them to go with a better solution.
I don't expect anyone to still believe that #Java is slow. But if you do, remember that the following tools are all written in Java or other #JVM languages: #Cassandra, #Hadoop, #Spark, #Kafka, #Elasticsearch, #DynamoDB...
Read my new article on Linux/hub
#howto - installing Hadoop and HDFS (https://linuxhub.it/articles/howto-installazione-hadoop/)