- Full Time
- National Capital Region - Manila City
- Salary As Per Industry Standard
Work Address: Metro Manila
Key Job Functions and Duties:
Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.
Working with data delivery teams to set up new Hadoop users. This includes creating Linux users, creating Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for the new users.
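The user-onboarding duty above can be sketched as a short shell walkthrough; the username, Kerberos realm, and HiveServer2 host below are hypothetical placeholders, not details from this posting:

```shell
# Sketch of onboarding a new Hadoop user (hypothetical user 'jdoe', realm EXAMPLE.COM)

# 1. Create the Linux user on the cluster nodes
sudo useradd -m jdoe

# 2. Create a Kerberos principal for the user (run on the KDC host)
sudo kadmin.local -q "addprinc jdoe@EXAMPLE.COM"

# 3. As the user: obtain a ticket, then smoke-test HDFS access
kinit jdoe@EXAMPLE.COM
hdfs dfs -mkdir -p /user/jdoe
hdfs dfs -ls /user/jdoe

# 4. Smoke-test Hive access via Beeline (HiveServer2 host/port are hypothetical)
beeline -u "jdbc:hive2://hs2.example.com:10000/default;principal=hive/_HOST@EXAMPLE.COM" \
        -e "show databases;"
```

These commands assume a Kerberized cluster; on an unsecured cluster the `kadmin`/`kinit` steps are skipped.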
Cluster maintenance for Cloudera Management Cluster
Performance tuning of Hadoop clusters, MapReduce, YARN and Spark
Screening Hadoop cluster job performance and capacity planning
Troubleshooting errors on Infrastructure and Platform level (Network/Connectivity/System) encountered by the team
Monitor Hadoop cluster connectivity and security.
Manage and review Hadoop log files.
File system management and monitoring.
HDFS support and maintenance.
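The file-system management and HDFS maintenance duties above boil down to routine health checks; a minimal sketch using standard HDFS commands (the paths are illustrative):

```shell
# Sketch of routine HDFS health checks (paths are illustrative)

# Filesystem integrity report: corrupt/missing blocks, replication status
hdfs fsck / -files -blocks

# Per-directory space usage, human-readable
hdfs dfs -du -h /user

# Confirm the NameNode is out of safe mode (writes are blocked while in it)
hdfs dfsadmin -safemode get
```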
Cloud Administration (AWS/Azure)
Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability.
Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.
Point of contact for vendor escalations
General operational expertise, such as good troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networks.
Hadoop ecosystem skills such as HBase, Hive, Impala, YARN, MapReduce, Sqoop, Spark, etc.
The most essential requirements: the candidate should be able to deploy a Hadoop cluster, add and remove nodes, keep track of jobs, monitor critical parts of the cluster, configure NameNode high availability, schedule and configure jobs, and take backups.
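The core operations listed above map to standard Hadoop CLI commands; a hedged sketch follows, where the hostnames, NameNode service IDs, and backup path are hypothetical:

```shell
# Common Hadoop administration commands for the duties above
# (hostnames, service IDs, and paths are hypothetical)

# Cluster health and capacity report across DataNodes
hdfs dfsadmin -report

# Decommission a node: add it to the exclude file, then refresh the NameNode
echo "worker05.example.com" | sudo tee -a /etc/hadoop/conf/dfs.exclude
hdfs dfsadmin -refreshNodes

# Keep track of running YARN jobs
yarn application -list -appStates RUNNING

# Check NameNode high-availability state (nn1/nn2 are hypothetical service IDs)
hdfs haadmin -getServiceState nn1
hdfs haadmin -getServiceState nn2

# Snapshot-based backup of a critical HDFS directory
hdfs dfsadmin -allowSnapshot /data/critical
hdfs dfs -createSnapshot /data/critical backup-$(date +%F)
```

On a Cloudera-managed cluster, decommissioning and HA configuration are typically driven through Cloudera Manager rather than the raw CLI, but the underlying operations are the same.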
Solid knowledge of the Linux/Red Hat operating system, as Hadoop runs on a Linux-based OS.
Familiarity with open-source configuration management and deployment tools such as Puppet, Chef, and Ansible, as well as Linux scripting.
Knowledgeable in Cloud Infrastructure and Platform (AWS/Azure/GCE)
Knowledge of Troubleshooting Core Java Applications is a plus
Knowledge of Kubernetes and Docker is a plus
Desired Experience & Qualities
Broad technical background with at least three years' work experience in a similar role as a Hadoop administrator
Knowledge of RHEL 7/CentOS 7 preferred
1 to 3 years' experience as a Kafka administrator
Preferred: Experience with Kafka in secure environments (Kerberized, encrypted)
Preferred: Experience with other Apache big data applications such as HBase, Cassandra, Storm, etc.