
How to Install Scrapy on CentOS 7

Published: 2021-07-12 14:45:04  Source: 億速云  Views: 172  Author: Leah  Category: Cloud Computing

Many newcomers are unsure how to install Scrapy on CentOS 7. The walkthrough below covers the process step by step, including the pitfalls to watch out for.

1. Install the development package group and update the OS

#yum groupinstall "Development Tools" -y
#yum update -y

Notes:

  1. If the Python on your system is older than 2.7, upgrade it first (Scrapy requires Python 2.7 or later).

#Download Python 2.7

#wget http://python.org/ftp/python/2.7.3/Python-2.7.3.tar.bz2

#Extract

#tar -jxvf Python-2.7.3.tar.bz2  
#cd Python-2.7.3

#Build and install

#./configure
#make all
#make install
#make clean
#make distclean

#Check the Python version

#/usr/local/bin/python2.7 -V

#Create a symlink so the system default python points to python2.7

#mv /usr/bin/python /usr/bin/python2.6.6
#ln -s /usr/local/bin/python2.7 /usr/bin/python
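Before rebuilding Python, it is worth checking whether the installed version is actually too old. The sketch below is a hypothetical helper (not part of the original walkthrough) that compares a dotted version string against the 2.7 minimum Scrapy requires, using GNU coreutils' version-aware `sort -V`:

```shell
# Hypothetical helper: true (exit 0) when the given Python version is below 2.7.
# Relies on GNU sort -V for version-aware ordering.
python_too_old() {
  lowest=$(printf '%s\n' "$1" 2.7 | sort -V | head -n1)
  [ "$lowest" != "2.7" ]
}

python_too_old "2.6.6" && echo "2.6.6 is too old: build Python 2.7 from source"
python_too_old "2.7.3" || echo "2.7.3 is new enough"
```

On a real system you would feed it the live interpreter's version, e.g. `python_too_old "$(python -V 2>&1 | awk '{print $2}')"` (Python 2 prints its version on stderr).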

#After repointing the python symlink to 2.7, yum stops working, because yum is not compatible with Python 2.7. Pin yum to its original interpreter:

vim /usr/bin/yum

Change the shebang at the top of the file from

#!/usr/bin/python

to

#!/usr/bin/python2.6.6
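The same edit can be done non-interactively with sed. The sketch below works on a throwaway copy so it can be run safely; on a real system the target would be /usr/bin/yum itself (assuming, as above, the original interpreter was renamed to /usr/bin/python2.6.6):

```shell
# Work on a temporary stand-in for /usr/bin/yum rather than the real file.
yumfile=$(mktemp)
printf '#!/usr/bin/python\nimport sys\n' > "$yumfile"

# Rewrite only the shebang on line 1, pinning yum to the renamed interpreter.
sed -i '1s|^#!/usr/bin/python$|#!/usr/bin/python2.6.6|' "$yumfile"

head -n1 "$yumfile"   # -> #!/usr/bin/python2.6.6
```

The `1s` address restricts the substitution to the first line, so an interpreter path mentioned elsewhere in the script is left untouched.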

  2. Upgrade to Python 2.7 before installing pip and setuptools. Doing it in the other order causes many hard-to-diagnose problems.

  3. If you upgraded to Python 2.7 in place, you will most likely end up building everything with python setup.py, including but not limited to these packages:

lxml,zope.interface,Twisted,characteristic,pyasn1-modules,service-identity,Scrapy

  PS: I started out building everything from source, and the most common failure was:

error: command 'gcc' failed with exit status 1

   It turned out that this message almost always means a -devel package or a shared library is missing. The most frustrating part: Scrapy reported a successful install but could not create a project, and even the sample tests would not run. In the end I simply switched to CentOS 7.
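As a rule of thumb, the gcc error can usually be traced to a specific missing header package. The sketch below is a hypothetical lookup table, assembled from the -devel packages this walkthrough installs later, mapping the Python packages that commonly fail to compile to the headers they need:

```shell
# Hypothetical mapping: Python source package -> CentOS -devel packages it
# compiles against. Assembled from the yum installs later in this guide.
devel_pkgs_for() {
  case "$1" in
    lxml)                   echo "libxml2-devel libxslt-devel" ;;
    cffi)                   echo "libffi-devel" ;;
    pyOpenSSL|cryptography) echo "openssl-devel" ;;
    *)                      echo "python-devel" ;;  # every C extension needs the Python headers
  esac
}

devel_pkgs_for lxml    # -> libxml2-devel libxslt-devel
```

When a `pip install` or `python setup.py` build fails, installing the matching -devel package and retrying is usually enough.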

Everything below is done on CentOS 7; readers who upgraded an older release to Python 2.7 can skip ahead.

2. Edit /etc/yum.repos.d/rpmforge.repo to add the RPMforge repository, which provides libffi-devel (without this repo, yum install libffi-devel reports that no package was found). The method in the original post no longer works; after some searching, overwrite the file with the following:

#Name: RPMforge RPM Repository for Red Hat Enterprise 5 - dag
#URL: http://rpmforge.net/
[rpmforge]
name = Red Hat Enterprise $releasever - RPMforge.net - dag
#baseurl = http://apt.sw.be/redhat/el5/en/$basearch/dag
mirrorlist = http://apt.sw.be/redhat/el5/en/mirrors-rpmforge
#mirrorlist = file:///etc/yum.repos.d/mirrors-rpmforge
enabled = 1
protect = 0
gpgkey = file:///etc/pki/rpm-gpg/RPM-GPG-KEY-rpmforge-dag
gpgcheck = 1

Then run:

sudo rpm --import http://apt.sw.be/RPM-GPG-KEY.dag.txt
sudo yum install libffi-devel

Original solution: http://www.lxway.com/164125081.htm

Note: if the rpmforge repository is not yet installed, install it first.

RPMforge is a combination of the Dag, Dries, and other package repositories, providing more than 10,000 packages for CentOS. It is not part of Red Hat Linux or CentOS, but it is designed for those distributions, so the repository itself must be installed before it can be used.

Download:

#32-bit:

wget http://packages.sw.be/rpmforge-release/rpmforge-release-0.5.1-1.el5.rf.i386.rpm  
rpm -ivh rpmforge-release-0.5.1-1.el5.rf.i386.rpm

#64-bit:

wget http://packages.sw.be/rpmforge-release/rpmforge-release-0.5.1-1.el5.rf.x86_64.rpm

Install:

rpm -ivh rpmforge-release-0.5.1-1.el5.rf.x86_64.rpm

After installation, the following files are created under /etc/yum.repos.d:

mirrors-rpmforge      -- a list of mirror sites

rpmforge.repo         -- the yum repository configuration

rpmforge-testing.repo -- the testing repository
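Rather than picking the 32-bit or 64-bit URL by hand, the architecture can be detected. The sketch below is a hypothetical helper built from the two package names listed above, mapping `uname -m` output to the matching rpmforge package:

```shell
# Hypothetical helper: choose the rpmforge-release package for a given
# machine architecture, using the two file names listed above.
rpmforge_pkg() {
  case "$1" in
    x86_64)    echo "rpmforge-release-0.5.1-1.el5.rf.x86_64.rpm" ;;
    i386|i686) echo "rpmforge-release-0.5.1-1.el5.rf.i386.rpm" ;;
    *)         echo "unsupported arch: $1" >&2; return 1 ;;
  esac
}

# On a real system: pkg=$(rpmforge_pkg "$(uname -m)")
echo "would fetch http://packages.sw.be/rpmforge-release/$(rpmforge_pkg x86_64)"
```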

3. If the audit package is installed, remove it first; it interferes with the Scrapy installation:

#yum remove audit

4. Install the development packages Scrapy needs

#yum install -y python-devel openssl-devel libxslt-devel libxml2-devel

5. Install pip and setuptools

#yum install python-pip -y

If yum reports that no package is available:

Derivative distributions such as CentOS sometimes lag behind in updating their repositories, and some add-on packages are missing entirely, so a yum search for python-pip finds nothing. To get these packages, first install the EPEL repository. EPEL (http://fedoraproject.org/wiki/EPEL) is a Fedora community project that provides high-quality packages for RHEL and derivatives such as CentOS and Scientific Linux.

First install the EPEL repository:

sudo yum -y install epel-release

Then install python-pip:

sudo yum -y install python-pip

Afterwards, don't forget to clear the yum cache, then install setuptools:

sudo yum clean all
#pip install setuptools
#pip install setuptools --upgrade

6. Install Scrapy

# pip install Scrapy

Collecting Scrapy
  Using cached Scrapy-1.0.3-py2-none-any.whl
Requirement already satisfied (use --upgrade to upgrade): cssselect>=0.9 in /usr/lib/python2.7/site-packages (from Scrapy)
Requirement already satisfied (use --upgrade to upgrade): queuelib in /usr/lib/python2.7/site-packages (from Scrapy)
Requirement already satisfied (use --upgrade to upgrade): pyOpenSSL in /usr/lib/python2.7/site-packages (from Scrapy)
Requirement already satisfied (use --upgrade to upgrade): w3lib>=1.8.0 in /usr/lib/python2.7/site-packages (from Scrapy)
Collecting lxml (from Scrapy)
  Using cached lxml-3.4.4.tar.gz
Collecting Twisted>=10.0.0 (from Scrapy)
  Using cached Twisted-15.4.0.tar.bz2
Requirement already satisfied (use --upgrade to upgrade): six>=1.5.2 in /usr/lib/python2.7/site-packages (from Scrapy)
Collecting service-identity (from Scrapy)
  Using cached service_identity-14.0.0-py2.py3-none-any.whl
Requirement already satisfied (use --upgrade to upgrade): cryptography>=0.7 in /usr/lib64/python2.7/site-packages (from pyOpenSSL->Scrapy)
Collecting zope.interface>=3.6.0 (from Twisted>=10.0.0->Scrapy)
  Using cached zope.interface-4.1.3.tar.gz
Collecting characteristic>=14.0.0 (from service-identity->Scrapy)
  Using cached characteristic-14.3.0-py2.py3-none-any.whl
Collecting pyasn1-modules (from service-identity->Scrapy)
  Using cached pyasn1_modules-0.0.8-py2.py3-none-any.whl
Requirement already satisfied (use --upgrade to upgrade): pyasn1 in /usr/lib/python2.7/site-packages (from service-identity->Scrapy)
Requirement already satisfied (use --upgrade to upgrade): idna>=2.0 in /usr/lib/python2.7/site-packages (from cryptography>=0.7->pyOpenSSL->Scrapy)
Requirement already satisfied (use --upgrade to upgrade): setuptools in /usr/lib/python2.7/site-packages (from cryptography>=0.7->pyOpenSSL->Scrapy)
Requirement already satisfied (use --upgrade to upgrade): enum34 in /usr/lib/python2.7/site-packages (from cryptography>=0.7->pyOpenSSL->Scrapy)
Requirement already satisfied (use --upgrade to upgrade): ipaddress in /usr/lib/python2.7/site-packages (from cryptography>=0.7->pyOpenSSL->Scrapy)
Requirement already satisfied (use --upgrade to upgrade): cffi>=1.1.0 in /usr/lib64/python2.7/site-packages (from cryptography>=0.7->pyOpenSSL->Scrapy)
Requirement already satisfied (use --upgrade to upgrade): pycparser in /usr/lib/python2.7/site-packages (from cffi>=1.1.0->cryptography>=0.7->pyOpenSSL->Scrapy)
Installing collected packages: lxml, zope.interface, Twisted, characteristic, pyasn1-modules, service-identity, Scrapy
  Running setup.py install for lxml
  Running setup.py install for zope.interface
  Running setup.py install for Twisted
Successfully installed Scrapy-1.0.3 Twisted-15.4.0 characteristic-14.3.0 lxml-3.4.4 pyasn1-modules-0.0.8 service-identity-14.0.0 zope.interface-4.1.3

7. Create a project

[root@localhost workspace]# scrapy startproject tutorial
2015-10-15 21:54:24 [scrapy] INFO: Scrapy 1.0.3 started (bot: scrapybot)
2015-10-15 21:54:24 [scrapy] INFO: Optional features available: ssl, http11
2015-10-15 21:54:24 [scrapy] INFO: Overridden settings: {}
New Scrapy project 'tutorial' created in:
    /workspace/tutorial
You can start your first spider with:
    cd tutorial
    scrapy genspider example example.com

8. Project layout

[root@localhost workspace]# tree
.
└── tutorial
    ├── scrapy.cfg
    └── tutorial
        ├── __init__.py
        ├── items.py
        ├── pipelines.py
        ├── settings.py
        └── spiders
            └── __init__.py
