Building a Scrapy Crawler Image from alpine with a Dockerfile
1. Pull the alpine image
[root@DockerBrian ~]# docker pull alpine
Using default tag: latest
Trying to pull repository docker.io/library/alpine ...
latest: Pulling from docker.io/library/alpine
4fe2ade4980c: Pull complete
Digest: sha256:621c2f39f8133acb8e64023a94dbdf0d5ca81896102b9e57c0dc184cadaf5528
Status: Downloaded newer image for docker.io/alpine:latest
[root@docker43 ~]# docker images
REPOSITORY         TAG      IMAGE ID       CREATED       SIZE
docker.io/alpine   latest   196d12cf6ab1   3 weeks ago   4.41 MB
2. Write the Dockerfile
Create a scrapy directory to hold the Dockerfile:
[root@DockerBrian ~]# mkdir /opt/alpineDockerfile/
[root@DockerBrian ~]# cd /opt/alpineDockerfile/
[root@DockerBrian alpineDockerfile]# mkdir scrapy && cd scrapy && touch Dockerfile
[root@DockerBrian alpineDockerfile]# cd scrapy/
[root@DockerBrian scrapy]# ll
total 4
-rw-r--r-- 1 root root 1394 Oct 10 11:36 Dockerfile
Write the Dockerfile:
# Base image
FROM alpine

# Maintainer info (note: MAINTAINER is deprecated in favor of LABEL)
MAINTAINER alpine_python3_scrapy (zhujingzhi@123.com)

# Switch the apk repositories to the Aliyun mirror
RUN echo "http://mirrors.aliyun.com/alpine/latest-stable/main/" > /etc/apk/repositories && \
    echo "http://mirrors.aliyun.com/alpine/latest-stable/community/" >> /etc/apk/repositories

# Update the package index, install openssh, adjust its configuration,
# generate host keys, set the root password, and sync the timezone
RUN apk update && \
    apk add --no-cache openssh-server tzdata && \
    cp /usr/share/zoneinfo/Asia/Shanghai /etc/localtime && \
    sed -i "s/#PermitRootLogin.*/PermitRootLogin yes/g" /etc/ssh/sshd_config && \
    ssh-keygen -t rsa -P "" -f /etc/ssh/ssh_host_rsa_key && \
    ssh-keygen -t ecdsa -P "" -f /etc/ssh/ssh_host_ecdsa_key && \
    ssh-keygen -t ed25519 -P "" -f /etc/ssh/ssh_host_ed25519_key && \
    echo "root:h056zHJLg85oW5xh7VtSa" | chpasswd

# Dependencies required by Scrapy (all of these are needed for the build)
RUN apk add --no-cache python3 python3-dev gcc openssl-dev openssl libressl libc-dev linux-headers libffi-dev libxml2-dev libxml2 libxslt-dev openssh-client openssh-sftp-server

# pip packages for the environment (add or remove to suit your needs)
RUN pip3 install --default-timeout=100 --no-cache-dir --upgrade pip setuptools pymysql pymongo redis scrapy-redis ipython Scrapy requests

# Startup script that launches sshd
RUN echo "/usr/sbin/sshd -D" >> /etc/start.sh && \
    chmod +x /etc/start.sh

# Expose the SSH port
EXPOSE 22

# Start sshd via the startup script
CMD ["/bin/sh","/etc/start.sh"]
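The start-script step is easy to sanity-check outside Docker: that RUN line simply appends one command to a file and marks it executable. A minimal local sketch of what it produces, writing into a temp directory instead of /etc:

```shell
# Mirror the Dockerfile's start.sh step locally (demo only; in the image
# the file is /etc/start.sh and CMD runs it with /bin/sh)
dir=$(mktemp -d)
echo "/usr/sbin/sshd -D" >> "$dir/start.sh"
chmod +x "$dir/start.sh"
cat "$dir/start.sh"        # prints: /usr/sbin/sshd -D
```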
This gives a container that can be reached remotely over SSH, with Scrapy installed on a Python 3 environment. The start.sh script launches the SSH service; since sshd runs with -D it stays in the foreground, which keeps the container alive as its main process.
3. Build the image
Build the image:
[root@DockerBrian scrapy]# docker build -t scrapy_redis_ssh:v1 .
Check the images:
[root@DockerBrian scrapy]# docker images
REPOSITORY         TAG      IMAGE ID       CREATED       SIZE
scrapy_redis_ssh   v1       b2c95ef95fb9   4 hours ago   282 MB
docker.io/alpine   latest   196d12cf6ab1   4 weeks ago   4.41 MB
4. Create the container
Create the container (named scrapy10086; the container's SSH port is mapped to port 10086 on the host).
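The `docker run` command itself is not shown in the original article. Judging by the container name and the port mapping in the `docker ps` output, it was presumably along these lines (a reconstruction, not from the article; it needs a running Docker daemon):

```shell
# Run the image detached as "scrapy10086", mapping host port 10086
# to the container's SSH port 22 (reconstructed command)
docker run -d --name scrapy10086 -p 10086:22 scrapy_redis_ssh:v1
```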
Check the container:
[root@DockerBrian scrapy]# docker ps
CONTAINER ID   IMAGE          COMMAND                  CREATED       STATUS       PORTS                   NAMES
7fb9e69d79f5   b2c95ef95fb9   "/bin/sh /etc/star..."   3 hours ago   Up 3 hours   0.0.0.0:10086->22/tcp   scrapy10086
Log in to the container:
[root@DockerBrian scrapy]# ssh root@127.0.0.1 -p 10086
The authenticity of host '[127.0.0.1]:10086 ([127.0.0.1]:10086)' can't be established.
ECDSA key fingerprint is SHA256:wC46AU6SLjHyEfQWX6d6ht9MdpGKodeMOK6/cONcpxk.
ECDSA key fingerprint is MD5:6a:b7:31:3c:63:02:ca:74:5b:d9:68:42:08:be:22:fc.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '[127.0.0.1]:10086' (ECDSA) to the list of known hosts.
root@127.0.0.1's password:    # the password defined in the Dockerfile: echo "root:h056zHJLg85oW5xh7VtSa" | chpasswd

Welcome to Alpine!

The Alpine Wiki contains a large amount of how-to guides and general
information about administrating Alpine systems.
See <http://wiki.alpinelinux.org>.

You can setup the system with the command: setup-alpine

You may change this message by editing /etc/motd.

7363738cc96a:~#
5. Test
Create a Scrapy project as a test:
7363738cc96a:~# scrapy startproject test
New Scrapy project 'test', using template directory '/usr/lib/python3.6/site-packages/scrapy/templates/project', created in:
    /root/test

You can start your first spider with:
    cd test
    scrapy genspider example example.com
7363738cc96a:~# cd test/
7363738cc96a:~/test# ls
scrapy.cfg  test
7363738cc96a:~/test# cd test/
7363738cc96a:~/test/test# ls
__init__.py  __pycache__  items.py  middlewares.py  pipelines.py  settings.py  spiders
7363738cc96a:~/test/test#
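A further check beyond `startproject` is to generate and run the template spider that Scrapy's own hint suggests. These commands are a sketch (not from the article) and would be run inside the container, where Scrapy is installed:

```shell
# Inside the container, from the project root created above
cd /root/test
scrapy genspider example example.com   # writes test/spiders/example.py
scrapy crawl example                   # runs the generated spider against example.com
```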
The test succeeded.
That is all for this article. I hope it helps with your learning, and I hope you will continue to support 脚本之家.