Crawl data from the World Bank according to indicator names.

$30-250 USD

Closed
Posted 8 months ago


Paid on delivery
I am looking for a freelancer to write a Python program that crawls data from the World Bank based on specific indicator names. The indicator names will be provided as arguments to the Python script.

1. The program should accept proxy IPs as an argument and use a random IP address from the given list for each request.
2. The program should retry crawling up to 3 times on failure (see the sketch after this description).

Passed arguments:

(1) --config [login to view URL]: contains the indicator name list, in JSON format, and specifies the global year range as well as per-indicator year ranges. For example:

{
  "global": {
    "starttime": "2021-01-01",   // YYYY-MM-DD; if not specified, no lower limit
    "endtime": "2021-01-01",     // YYYY-MM-DD; if not specified, no upper limit
    "craw_interval": 3000,       // ms, crawl interval between requests
    "proxies": [                 // supports multiple proxies; use a random proxy. If a proxy connection fails, retry 3 times.
      {
        "proxy_type": "http",
        "proxy": "[login to view URL]"
      }
    ]
  },
  "indicators": [
    {
      "meta_name": "[login to view URL]",
      "indicator_name": "Population Ages",
      "indicator_code": "pop",
      "starttime": "2021-01-01",  // YYYY-MM-DD; if not specified, follow the global config
      "endtime": "2021-01-01"     // YYYY-MM-DD; if not specified, follow the global config
    },
    {
      "meta_name": "[login to view URL]",   // original indicator name on the World Bank site
      "indicator_name": "Population Ages",
      "indicator_code": "pop",
      "starttime": "2021-01-01",  // YYYY-MM-DD; if not specified, follow the global config
      "endtime": "2021-01-01"     // YYYY-MM-DD; if not specified, follow the global config
    }
  ],
  "mapping_dict": {
    "country": [
      {
        "name": "USA",
        "code": "USA",
        "alias": [                // search every alias, ignoring case
          "United States of America",
          "US"
        ]
      }
    ],
    "field": [                    // used to remap output value fields when a single indicator contains multiple value fields; the default field is "value"
      {
        "meta_name": "value",     // meta_name is the original field name; the key "value" represents the original default value field
        "meta_value": "value"
      },
      {
        "meta_name": "score",
        "meta_value": "Final Score"
      }
    ]
  }
}

(2) --output: output path of the final JSON file. The output format should be JSON.

Output format:

{
  "indicators": [
    {
      "meta_name": "[login to view URL]",
      "indicator_name": "Population Ages",
      "indicator_code": "pop",
      "starttime": "2021-01-01",  // YYYY-MM-DD; if not specified, follow the global config
      "endtime": "2021-01-01",    // YYYY-MM-DD; if not specified, follow the global config
      "status": "success",        // if the crawl failed, set status = "failed"
      "errmsg": "OK",             // if the crawl failed, show the error message here
      "total": 2939293,           // total number of data entries
      "countries": 238823,        // total number of countries
      "years": 60                 // total number of years
    }
  ],
  "data": [
    {
      "datasource": "worldbank",
      "ref_link": "[login to view URL]",   // the indicator's original link
      "meta_name": "[login to view URL]",
      "indicator_name": "Population Ages",
      "indicator_code": "pop",
      "country_name": "USA",
      "country_code": "USA",
      "crawl_time": "2022-01-01 12:00:00",
      "year": "2022",
      "starttime": "2022-01-01",  // if the data source only contains a year such as 2022, set this to the start of the year
      "endtime": "2022-12-30",    // if the data source only contains a year such as 2022, set this to the end of the year
      "values": {                 // apply the field mapping to convert to the new field names first; the keys should be the new field names
        "value": 123,
        "Final Score": 300
      }
    }
  ]
}
3. The output should contain (1) failed indicator names with the failure reasons and (2) the remapped indicator_name and value fields.

[login to view URL] provide: (1) [login to view URL], (2) the Python code, (3) a simple test case ([login to view URL] and a test command).

Skills and Experience:
- Strong experience in web scraping and data crawling
- Proficiency in Python or another suitable programming language for web scraping
- Familiarity with the World Bank's data structure and API

Data Format:
- The crawled data should be in JSON format.

Data Cleaning and Structuring:
- The client requires the data to be cleaned and structured according to specific data attributes.

Please provide examples of similar projects you have completed in the past.
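For orientation only, here is a minimal sketch (not the deliverable) of how the core requirements could fit together: --config/--output arguments, a random proxy per request with up to 3 retries, paging through the World Bank API, and field remapping via mapping_dict. It assumes the public endpoint https://api.worldbank.org/v2 and the requests library, expects a comment-free JSON config, and fills in only part of the output schema above; real indicator codes (for example SP.POP.TOTL) would replace the illustrative "pop", and all helper names are placeholders.

import argparse
import json
import random
import time

import requests

WB_API = "https://api.worldbank.org/v2/country/all/indicator/{code}"


def fetch_indicator(code, start_year, end_year, proxies, retries=3, interval_ms=3000):
    """Fetch every page for one indicator, retrying each request up to `retries` times."""
    page, rows = 1, []
    while True:
        params = {"format": "json", "per_page": 1000, "page": page,
                  "date": f"{start_year}:{end_year}"}
        for attempt in range(retries):
            # Pick a random proxy per request; this sketch applies it to both schemes
            # and ignores proxy_type for simplicity.
            proxy = random.choice(proxies) if proxies else None
            proxy_map = {"http": proxy["proxy"], "https": proxy["proxy"]} if proxy else None
            try:
                resp = requests.get(WB_API.format(code=code), params=params,
                                    proxies=proxy_map, timeout=30)
                resp.raise_for_status()
                meta, data = resp.json()  # v2 responses are [metadata, rows]
                break
            except Exception:
                if attempt == retries - 1:
                    raise
        rows.extend(data or [])
        if page >= meta.get("pages", 1):
            return rows
        page += 1
        time.sleep(interval_ms / 1000.0)


def remap_fields(record, field_mapping):
    """Rename value fields according to mapping_dict['field'] (meta_name -> meta_value)."""
    renames = {m["meta_name"]: m["meta_value"] for m in field_mapping}
    return {renames.get(k, k): v for k, v in record.items()}


def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("--config", required=True)
    parser.add_argument("--output", required=True)
    args = parser.parse_args()

    # Assumes the config file is plain JSON (no // comments as in the annotated example).
    with open(args.config, encoding="utf-8") as f:
        cfg = json.load(f)

    g = cfg.get("global", {})
    out = {"indicators": [], "data": []}
    for ind in cfg["indicators"]:
        # Per-indicator dates fall back to the global config; the 1960/2100 bounds
        # are placeholder defaults standing in for "no limit".
        start = (ind.get("starttime") or g.get("starttime") or "1960-01-01")[:4]
        end = (ind.get("endtime") or g.get("endtime") or "2100-01-01")[:4]
        try:
            rows = fetch_indicator(ind["indicator_code"], start, end,
                                   g.get("proxies", []),
                                   interval_ms=g.get("craw_interval", 3000))
            status, errmsg = "success", "OK"
        except Exception as exc:
            rows, status, errmsg = [], "failed", str(exc)
        out["indicators"].append({**ind, "status": status, "errmsg": errmsg,
                                  "total": len(rows)})
        for r in rows:
            # Only a subset of the required output fields is shown here.
            out["data"].append(remap_fields(
                {"datasource": "worldbank",
                 "indicator_code": ind["indicator_code"],
                 "country_name": (r.get("country") or {}).get("value"),
                 "year": r.get("date"),
                 "value": r.get("value")},
                cfg.get("mapping_dict", {}).get("field", [])))

    with open(args.output, "w", encoding="utf-8") as f:
        json.dump(out, f, ensure_ascii=False, indent=2)


if __name__ == "__main__":
    main()

A test command along the lines of "python wb_crawler.py --config config.json --output output.json" would exercise the sketch; the script and file names are placeholders rather than required deliverable names.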
Project ID: 37219018

Project information

23 proposals
Remote project
Active 7 months ago

23 freelancers are bidding on average $189 USD for this job.
Hi! I'm George, a web crawler expert with many successful data crawling projects in my portfolio. I'm confident I can provide you with the necessary expertise to meet your requirements for this job. I have the necessary skills, including experience in web scraping, proficiency in Python (or any other programming language) for web scraping, and complete familiarity with the World Bank's data structure and API. I'm capable of outputting the crawled data in JSON format and cleaning and structuring it according to specific data attributes. In the past, I have developed custom web scraping tools and used them to crawl different types of data and websites. I am confident I have the right knowledge and technical skills to handle your project efficiently. Questions: 1. What is the expected delivery timeline? 2. Do the arguments need to be validated before accessing the World Bank source? 3. Can you provide sample data and an example of what the output should look like? 4. What is the primary language for the project? 5. Will support be provided if needed during the project's development?
$500 USD in 3 days
5.0 (127 reviews)
7.7
Hi, I can provide you with a Python script, as well as the data in JSON format according to the format you shared, from the World Bank website. I can start right away. Abdul H.
$150 USD in 1 day
5.0 (147 reviews)
6.8
Top 1% in Freelancer.com. Hi, greetings! ✅ Checked your project details. ✅ Completion time: within the project deadline. We have worked on 900+ projects, and I have 6+ years of experience in this kind of project. If you are looking for a true freelancer, I am the right person for you. I am available almost 24/7 and am very responsive. I feel proud that I am a trusted freelancer who pleases almost every single client. You can rest assured your work will be delivered well in advance of others, with passion and accuracy. I guarantee you instant communication and responses when you need me. Why choose me? I think every client is the reason for my success, and I only take projects which I am sure I can do quickly. My portfolio: https://www.freelancer.com/u/schoudhary1553 I would really like to work with you on this project. If interested, kindly contact me via chat for further details and discussion. Thank you, Sandeep
$180 USD in 3 days
4.9 (92 reviews)
6.9
Hello Suley! I hope you're well. I'm a senior scraper developer with experience in developing scrapers using Scrapy and headless browsers. I can deal with bypassing IP throttling limits, bans and captcha solving, and with storing results in JSON, CSV and Excel files. I've delivered more than 100 projects over time with a 5* rating. Here are some of my skills necessary for this task. ➢ Python: deep understanding of Python and libraries like Scrapy, proxies, BeautifulSoup, lxml, captcha handling ➢ Tools: headless browsers, Selenium, Playwright ➢ Databases: MySQL, Postgres, Oracle, MongoDB ➢ Source code management: Git, GitLab, Bitbucket, SVN ➢ Cloud providers: AWS, GCP and Azure ➢ Containerisation: Docker, Kubernetes Best, Sonu
$200 USD in 7 days
4.9 (54 reviews)
6.2
World Bank scraper in Python. Dear Client, I am thrilled to express my interest in your web scraping and automation project. I am confident in my ability to deliver outstanding results that align with your requirements, and I am excited about the opportunity to contribute to your project's success. My services: 1. Scraping APIs, HTML or JavaScript-rendered web pages using Python requests, Scrapy, Selenium and bs4. 2. Fast scraping without blocking via multi-threading, aiohttp/asyncio, proxy rotation, reCAPTCHA bypass and stealth browsers. 3. Console or GUI-based scrapers using Python Tkinter or PyQt5. 4. Real-time stream data scraping via client WebSocket. 5. Notifications to users on several conditions while monitoring, using webhooks. 6. Storing into CSV or Google Sheets, or uploading into a DB such as MySQL, MongoDB or PostgreSQL. 7. Web apps that show scraping or live monitoring results. 8. Scheduled scraper runs via cron. Looking forward to the possibility of working together. Best regards, Manoj
$250 USD in 3 days
5.0 (9 reviews)
5.0
Dear sir, I have gone through your project, and it matches my expertise. I'm confident I can perform your project work. I have 6+ years of experience in web scraping, Python, debugging and software architecture. I can do your project accurately at the lowest rate you want, and I can assure you the quality of the work will be top class. Thank you.
$140 USD in 2 days
5.0 (18 reviews)
5.1
Hi Sir, as a highly skilled and experienced developer, I am confident that I can provide the high-quality work you need. I am ready to start the work right away. Thank you.
$140 USD in 7 days
5.0 (8 reviews)
4.4
Dear Client, I am excited to submit my proposal for your project, which involves writing a Python program to crawl data from the World Bank based on specific indicator names. I have extensive experience in web scraping, data crawling, and data structuring, making me well-suited for this task. I am committed to delivering high-quality code and meeting your project's requirements within the specified timeframe. If you believe that my qualifications align with your project needs, I would welcome the opportunity to discuss the project in more detail and provide a tailored proposal. Thank you for considering my bid, and I look forward to the possibility of collaborating with your team. Best regards, Lalit
$220 USD in 7 days
5.0 (16 reviews)
3.7
Hi sir, I'm excited about your project and confident in my ability to deliver it. I'm committed to exceeding your expectations and ready to start right away. Thank you.
$140 USD in 7 days
5.0 (4 reviews)
3.4
I am a professional Swift coder with skills including web crawling and Python. Please contact me to discuss this project further. Thank you.
$100 USD in 3 days
5.0 (1 review)
3.4
Hello, I am DataScinceFizer. I would like to work on this using Python to create automated scraping scripts and tools. I have extensive experience using Python to create website scraping tools that collect data from a target website, as well as Selenium automation. I have a degree in Data Science and specialize in Machine Learning. My background is in electrical engineering, and I have worked as a data science engineer for 13+ years with consistently excellent delivery. Additionally, I provide services around the clock without limitation, and deliveries are on time. Rapid action on services is my motto, which shows my commitment to customer satisfaction. @DataScinceFizer
$80 USD in 2 days
4.7 (4 reviews)
2.1
Hello, I am writing to bid on your project to write a Python program to crawl data from the World Bank based on specific indicator names. I have strong experience in web scraping and data crawling, and I am proficient in Python. I am also familiar with the World Bank's data structure and API. I propose to complete this project for a fixed fee of 170. I will provide you with a detailed timeline and deliverables schedule once we have discussed your specific requirements. Here is a high-level overview of my approach to this project:
1. I will develop a Python program to crawl the World Bank's API and download the data for the specified indicator names.
2. I will use proxy servers to avoid getting blocked by the World Bank.
3. I will retry the crawl 3 times if it fails.
4. I will clean and structure the data according to your specific requirements.
5. I will save the data to a JSON file in the specified output path.
6. I have attached sample Python code and a simple test case for your review.
I am confident that I have the skills and experience necessary to complete this project successfully. I am committed to providing you with high-quality work and meeting your deadlines. Thank you for your time and consideration. I look forward to hearing from you soon. Sincerely, Teersingh
$220 USD in 7 days
5.0 (4 reviews)
1.8
Hello there! I understand you are looking for a freelancer to write a python program that will crawl data from the World Bank based on specific indicator names. I believe I am the best fit for this project because of my experience in coding and specifically in Python. I have a degree in Computer Science, which has taught me the importance of keeping code error-free and efficient. This is an important skill when it comes to writing programs such as this one, as any mistakes made during development could result in the program failing to complete its task within the desired time frame. I would be more than happy to help you with your project. Please let me know if you'd like me to discuss further or answer any additional questions you may have about my skills or experience.
$99 USD in 1 day
5.0 (2 reviews)
0.0
Hello, I have the web scraping skills that you require to obtain the data you need from the World Bank. To build a shared understanding, I would suggest that you share a portion of the project detail so that I can first provide you with a sample of my work. Regards.
$100 USD in 5 days
0.0 (0 reviews)
0.0

About the client

beijing, United States
5.0 (1 review)
Payment method verified
Member since November 12, 2019
