
Configure HDFS and Spark on DC/OS Cluster

€18-36 EUR / hour

Closed
Posted over 5 years ago


We are running a DC/OS cluster on AWS and manage it using Terraform. We have deployed HDFS on DC/OS, but we still can't connect from a local machine to the cluster's HDFS, and we also can't consume data from HDFS/S3 using Spark, due to an error caused by the missing Hadoop client. The HDFS cluster and Spark are set up and report healthy states. The client's [login to view URL] and [login to view URL] have also been downloaded and placed under the local Hadoop installation, but:

1) 'hdfs dfs -ls /' leads to an error
2) Spark complains that it cannot find Hadoop/HDFS

Here are links describing the very same problems, but we haven't managed to resolve them. So we're looking for expert help from someone who already has experience running HDFS on DC/OS and can point out what to change. Thanks.

[login to view URL]
[login to view URL]

Please answer the questions below when applying for the job:
1. Do you have experience with DC/OS installs and maintenance?
2. Have you ever configured DC/OS HDFS and DC/OS Spark?
3. Have you managed custom Spark deployments (--packages, s3://)?
4. Do you have suggestions to make this project run successfully?
5. What questions do you have about the project?
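For context, both symptoms usually come down to the local tools not seeing the cluster's client configuration and the S3A client jars. A minimal sketch of the usual fix — the endpoint hostname below is the DC/OS default for an HDFS service named "hdfs", and the hadoop-aws version is an assumption that must match the Hadoop build your Spark ships with:

```shell
# Fetch the client config published by the DC/OS HDFS service and place it
# where the local Hadoop client looks for it ($HADOOP_CONF_DIR).
curl -o "$HADOOP_CONF_DIR/core-site.xml" \
  http://api.hdfs.marathon.l4lb.thisdcos.directory/v1/endpoints/core-site.xml
curl -o "$HADOOP_CONF_DIR/hdfs-site.xml" \
  http://api.hdfs.marathon.l4lb.thisdcos.directory/v1/endpoints/hdfs-site.xml

# With the config in place, list against the HA nameservice defined in
# hdfs-site.xml (here assumed to be "hdfs") rather than a bare "/":
hdfs dfs -ls hdfs://hdfs/

# For Spark + S3, pull in the S3A client at submit time instead of relying
# on a local Hadoop install; pick the hadoop-aws version matching your Spark.
spark-submit \
  --packages org.apache.hadoop:hadoop-aws:2.7.7 \
  --conf spark.hadoop.fs.s3a.access.key="$AWS_ACCESS_KEY_ID" \
  --conf spark.hadoop.fs.s3a.secret.key="$AWS_SECRET_ACCESS_KEY" \
  my_job.py
```

Note this only works from a host that can resolve the DC/OS L4LB names (e.g. from inside the cluster or over a VPN), which is often the actual cause of "can't connect from local".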
Project ID: 18324290

Project information

10 proposals
Remote project
Active 5 years ago

The average bid from the 10 freelancers bidding on this project is €30 EUR/hour.
Hello, I can configure HDFS and Spark on your DC/OS cluster. Please let me know if you are interested so we can discuss the project over chat. Waiting for your reply. Thank you.
€36 EUR in 40 days
4.8 (25 reviews)
4.7
I am a software developer with 10+ years of experience. My industry experience includes Big Data, backend, and DevOps (AWS and Azure). I did my M.Tech in Computer Science Engineering at the Indian Institute of Technology (IIT) Guwahati, India. Some of my skills are listed below. Amazon Web Services: EC2, Load Balancers, RDS, Elastic Beanstalk, Athena, Glue, EMR, IAM, VPC, Security Groups, CloudFormation, Lambda, S3, CloudFront, Redshift, Route 53, CloudWatch, Kinesis, SNS, SQS, API Gateway, etc. Azure: VMs, SQL Databases, Storage, etc. Programming languages: C, C++, Java, Go, Python, Bash. Big Data technologies: Hadoop, Spark, Hive, HDFS, HBase, Kafka, Flume, MemSQL, Aerospike, Redis, etc. Miscellaneous: algorithms, data structures, distributed systems, operating systems. Databases: Oracle, MySQL, SQL Server. Operating systems: Windows, Linux, Ubuntu, etc. Code deployments: Puppet, Docker, Airflow.
€30 EUR in 15 days
4.9 (28 reviews)
4.8
Hello, I have been working in Big Data/Hadoop technologies for years. I have worked with Hadoop, MapReduce, Spark (Streaming & MLlib), Kafka, the ELK Stack, Cassandra, MongoDB, and PostgreSQL, implementing the above in Java, Scala, and Python, and have also worked with Shell, Perl, and Python. Can we talk further about this? Thank you!
€30 EUR in 40 days
4.9 (22 reviews)
4.9
Hi, I have more than 4 years of experience in Hadoop technologies like HDFS, MapReduce, Spark, Sqoop, etc. Contact me for more details.
€33 EUR in 40 days
4.6 (7 reviews)
3.7
Hi, I have 7 years of experience working on Hadoop, Spark, NoSQL, Java, BI tools (Tableau, Power BI), and cloud (Amazon, Google, Microsoft Azure). I have done end-to-end data warehouse management projects on the AWS cloud with Hadoop, Hive, Spark, and PrestoDB, and have worked on multiple ETL projects involving Kafka, NiFi, Flume, MapReduce, Spark with XML/JSON, Cassandra, MongoDB, HBase, Redis, Oracle, SAP HANA, ASE, and more. Let's discuss the requirements in detail. I am committed to getting the work done and strong at issue resolution as well. Thanks.
€20 EUR in 40 days
4.9 (3 reviews)
2.0
I can help you with this project:
- Well experienced in Big Data (Hadoop | Kafka | Spark | NoSQL | Cloud) administration and platforms that accommodate expanding business needs.
- Well experienced with different Big Data and cloud vendors (Hortonworks, Cloudera, MapR, etc.).
- Big Data capacity planning, tuning, solution architecting, issue resolution, and security.
- Big Data analytics solution architecture and its implementation.
Answers:
1. Do you have experience with DC/OS installs and maintenance? Yes.
2. Have you ever configured DC/OS HDFS and DC/OS Spark? Yes.
3. Have you managed custom Spark deployments (--packages, s3://)? Yes.
4. Do you have suggestions to make this project run successfully? Once I get more insight into the use case, I can guide accordingly.
5. What questions do you have about the project? Once I get more insight into the use case, I may have more questions.
Thanks, JJ
€30 EUR in 40 days
0.0 (0 reviews)
0.0
Hello, I hope you are doing absolutely great. Instead of writing a proposal, I would like to approach you directly for this job, as we have been working with AWS for the last 3 years. Let's take this forward, as it looks like you need an AWS expert to work on running HDFS/DC/OS and point out what to change. Hours of work: 40 hr/week. We have working experience in Node.js, MongoDB, Express.js, NoSQL (Couch & Mongo), PostgreSQL, WebRTC, MySQL, PHP, system administration, frontend and backend development, load balancing, video broadcasting, video services, Amazon Web Services, cloud computing, Linux, and network and database management. So I am confident that we will be able to work on your project. Looking forward to connecting with you in chat. Best regards, Akash!
€20 EUR in 40 days
0.0 (0 reviews)
0.0

About the client

Netherlands
0.0
0
Member since December 8, 2018

Client verification

Freelancer ® is a registered Trademark of Freelancer Technology Pty Limited (ACN 142 189 759)
Copyright © 2024 Freelancer Technology Pty Limited (ACN 142 189 759)