
Commit

fix quote
wnma3mz committed Dec 7, 2023
1 parent 896bfae commit 43c75a0
Showing 26 changed files with 5,660 additions and 1,997 deletions.
9 changes: 5 additions & 4 deletions 2017/04/09/Hadoop伪分布式安装(Centos7)/index.html
@@ -29,7 +29,7 @@
<meta property="og:locale">
<meta property="og:image" content="https://raw.githubusercontent.com/wnma3mz/blog_posts/master/imgs/hadoop/wKiom1SFSNnAeaEUAADD3ZcjTjw828.jpg">
<meta property="article:published_time" content="2017-04-09T09:42:00.000Z">
<meta property="article:modified_time" content="2023-12-05T01:45:47.648Z">
<meta property="article:modified_time" content="2023-12-07T01:19:31.255Z">
<meta property="article:author" content="wnma3mz">
<meta property="article:tag" content="Linux">
<meta property="article:tag" content="Hadoop">
@@ -233,7 +233,7 @@ <h1 class="post-title" itemprop="name headline">
<i class="far fa-calendar-check"></i>
</span>
<span class="post-meta-item-text">更新于</span>
<time title="修改時間:2023-12-05 09:45:47" itemprop="dateModified" datetime="2023-12-05T09:45:47+08:00">2023-12-05</time>
<time title="修改時間:2023-12-07 09:19:31" itemprop="dateModified" datetime="2023-12-07T09:19:31+08:00">2023-12-07</time>
</span>
<span class="post-meta-item">
<span class="post-meta-item-icon">
@@ -286,7 +286,8 @@ <h3 id="hdfs启动与停止">hdfs启动与停止</h3>
<h3 id="配置和启动yarn">配置和启动YARN</h3>
<figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br><span class="line">14</span><br><span class="line">15</span><br><span class="line">16</span><br><span class="line">17</span><br><span class="line">18</span><br><span class="line">19</span><br><span class="line">20</span><br><span class="line">21</span><br><span class="line">22</span><br><span class="line">23</span><br><span class="line">24</span><br><span class="line">25</span><br><span class="line">26</span><br><span class="line">27</span><br><span class="line">28</span><br><span class="line">29</span><br><span class="line">30</span><br><span class="line">31</span><br><span class="line">32</span><br></pre></td><td class="code"><pre><span class="line"><span class="comment">#切换配置文件目录</span></span><br><span class="line"><span class="built_in">cd</span> /usr/<span class="built_in">local</span>/hadoop-2.7.3/etc/hadoop/</span><br><span class="line">mv mapred-site.xml.template mapred-site.xml</span><br><span class="line">vim mapred-site.xml</span><br><span class="line"><span class="comment">#添加&lt;configuration&gt;&lt;/configuration&gt;中间的配置</span></span><br><span class="line">&lt;configuration&gt;</span><br><span class="line"> &lt;!-- 通知框架MR使用YARN --&gt;</span><br><span class="line"> &lt;property&gt;</span><br><span class="line"> &lt;name&gt;mapreduce.framework.name&lt;/name&gt;</span><br><span class="line"> &lt;value&gt;yarn&lt;/value&gt;</span><br><span class="line"> &lt;/property&gt;</span><br><span class="line">&lt;/configuration&gt;</span><br><span class="line"><span class="comment">#保存退出</span></span><br><span class="line"></span><br><span class="line">vim yarn-site.xml</span><br><span class="line"><span class="comment">#添加如上配置</span></span><br><span class="line">&lt;configuration&gt;</span><br><span class="line"> &lt;!-- reducer取数据的方式是mapreduce_shuffle --&gt;</span><br><span class="line"> &lt;property&gt;</span><br><span class="line"> &lt;name&gt;yarn.nodemanager.aux-services&lt;/name&gt;</span><br><span class="line"> &lt;value&gt;mapreduce_shuffle&lt;/value&gt;</span><br><span class="line"> &lt;/property&gt;</span><br><span class="line">&lt;/configuration&gt;</span><br><span class="line"><span class="comment">#保存退出</span></span><br><span class="line"></span><br><span class="line"><span class="comment">#启动YARN</span></span><br><span class="line">start-yarn.sh</span><br><span class="line"><span class="comment">#输入jps检查是否成功</span></span><br><span class="line">jps</span><br><span class="line"><span class="comment">#出现六行顺序不定,分别为SecondaryNameNode、DataNode、NameNode、Jps、ResourceManager、NodeManager,即表示成功</span></span><br><span class="line"><span class="comment">#打开浏览器输入:服务器ip:8088</span></span><br><span class="line"><span class="comment">#若无响应,可参照上文中的开放50070端口,开放8088端口,只需将50070换为8088即可</span></span><br></pre></td></tr></table></figure>
<h2 id="测试">测试</h2>
<p>在本地新建一个文件,如在<code>/home/user/</code>下新建<code>words.txt</code>,内容如下 <figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br></pre></td><td class="code"><pre><span class="line">hello world</span><br><span class="line">hello hadoop</span><br><span class="line">hello csdn</span><br><span class="line">hello</span><br></pre></td></tr></table></figure> 正式进行测试。命令如下: <figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br><span class="line">14</span><br><span class="line">15</span><br><span class="line">16</span><br><span class="line">17</span><br></pre></td><td class="code"><pre><span class="line"><span class="comment">#在hdfs根目录下新建test目录</span></span><br><span class="line">bin/hdfs dfs -mkdir /<span class="built_in">test</span></span><br><span class="line"></span><br><span class="line"><span class="comment">#查看hdfs根目录下的目录结构</span></span><br><span class="line">bin/hdfs dfs -ls /</span><br><span class="line"></span><br><span class="line"><span class="comment">#将本地文件上传至/test/目录下</span></span><br><span class="line">bin/hdfs dfs -put /home/user/words.txt /<span class="built_in">test</span>/</span><br><span class="line"></span><br><span class="line"><span class="comment">#运行wordcount</span></span><br><span class="line">bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.1.jar wordcount /<span class="built_in">test</span>/words.txt /<span class="built_in">test</span>/out</span><br><span class="line"></span><br><span class="line"><span class="comment">#在/test/目录下生成了一个名为out的文件目录,查看一下/out/目录下的文件</span></span><br><span class="line">bin/hdfs dfs -ls /<span class="built_in">test</span>/out</span><br><span class="line"></span><br><span class="line"><span class="comment">#结果保存在part-r-00000,查看一下运行结果</span></span><br><span class="line">bin/hdfs fs -cat /<span class="built_in">test</span>/out/part-r-00000</span><br></pre></td></tr></table></figure> ## HDFS的常用操作命令</p>
<p>在本地新建一个文件,如在<code>/home/user/</code>下新建<code>words.txt</code>,内容如下 <figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br></pre></td><td class="code"><pre><span class="line">hello world</span><br><span class="line">hello hadoop</span><br><span class="line">hello csdn</span><br><span class="line">hello</span><br></pre></td></tr></table></figure> 正式进行测试。命令如下: <figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br><span class="line">11</span><br><span class="line">12</span><br><span class="line">13</span><br><span class="line">14</span><br><span class="line">15</span><br><span class="line">16</span><br><span class="line">17</span><br></pre></td><td class="code"><pre><span class="line"><span class="comment">#在hdfs根目录下新建test目录</span></span><br><span class="line">bin/hdfs dfs -mkdir /<span class="built_in">test</span></span><br><span class="line"></span><br><span class="line"><span class="comment">#查看hdfs根目录下的目录结构</span></span><br><span class="line">bin/hdfs dfs -ls /</span><br><span class="line"></span><br><span class="line"><span class="comment">#将本地文件上传至/test/目录下</span></span><br><span class="line">bin/hdfs dfs -put /home/user/words.txt /<span class="built_in">test</span>/</span><br><span class="line"></span><br><span class="line"><span class="comment">#运行wordcount</span></span><br><span class="line">bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.1.jar wordcount /<span class="built_in">test</span>/words.txt /<span class="built_in">test</span>/out</span><br><span class="line"></span><br><span class="line"><span class="comment">#在/test/目录下生成了一个名为out的文件目录,查看一下/out/目录下的文件</span></span><br><span class="line">bin/hdfs dfs -ls /<span class="built_in">test</span>/out</span><br><span class="line"></span><br><span class="line"><span class="comment">#结果保存在part-r-00000,查看一下运行结果</span></span><br><span class="line">bin/hdfs fs -cat /<span class="built_in">test</span>/out/part-r-00000</span><br></pre></td></tr></table></figure></p>
<h2 id="hdfs的常用操作命令">HDFS的常用操作命令</h2>
<figure class="highlight bash"><table><tr><td class="gutter"><pre><span class="line">1</span><br><span class="line">2</span><br><span class="line">3</span><br><span class="line">4</span><br><span class="line">5</span><br><span class="line">6</span><br><span class="line">7</span><br><span class="line">8</span><br><span class="line">9</span><br><span class="line">10</span><br></pre></td><td class="code"><pre><span class="line"><span class="comment">#常用操作:</span></span><br><span class="line"><span class="comment">#HDFS shell</span></span><br><span class="line"><span class="comment">#查看帮助</span></span><br><span class="line">hadoop fs -<span class="built_in">help</span> &lt;cmd&gt;</span><br><span class="line"><span class="comment">#上传</span></span><br><span class="line">hadoop fs -cat &lt;hdfs上的路径&gt;</span><br><span class="line"><span class="comment">#查看文件列表</span></span><br><span class="line">hadoop fs -ls /</span><br><span class="line"><span class="comment">#下载文件</span></span><br><span class="line">hadoop fs -get &lt;hdfs上的路径&gt; &lt;linux上文件&gt;</span><br></pre></td></tr></table></figure>
<p>下一篇:<a href="/hexo_blog/2017/05/07/%E5%9C%A8Hadoop%E5%9F%BA%E7%A1%80%E4%B8%8AHive%E7%9A%84%E5%AE%89%E8%A3%85/" title="在Hadoop基础上Hive的安装">在Hadoop基础上Hive的安装</a></p>

@@ -391,7 +392,7 @@ <h2 id="测试">测试</h2>

<!--noindex-->
<div class="post-toc-wrap sidebar-panel">
<div class="post-toc motion-element"><ol class="nav"><li class="nav-item nav-level-2"><a class="nav-link" href="#%E6%96%87%E7%AB%A0%E5%BC%80%E5%A4%B4"><span class="nav-text">文章开头</span></a></li><li class="nav-item nav-level-2"><a class="nav-link" href="#%E5%AE%89%E8%A3%85%E4%B9%8B%E5%89%8D"><span class="nav-text">安装之前</span></a></li><li class="nav-item nav-level-2"><a class="nav-link" href="#%E8%A7%A3%E5%8E%8B%E7%BC%A9%E5%8F%8A%E9%85%8D%E7%BD%AE"><span class="nav-text">解压缩及配置</span></a><ol class="nav-child"><li class="nav-item nav-level-3"><a class="nav-link" href="#%E9%85%8D%E7%BD%AEjava%E7%8E%AF%E5%A2%83%E5%8F%98%E9%87%8F%E5%8F%AF%E4%B8%8E%E9%85%8D%E7%BD%AEhadoop%E7%8E%AF%E5%A2%83%E5%8F%98%E9%87%8F%E9%85%8D%E7%BD%AE%E4%B8%80%E8%B5%B7%E8%BF%9B%E8%A1%8C"><span class="nav-text">配置Java环境变量(可与配置Hadoop环境变量配置一起进行)</span></a></li><li class="nav-item nav-level-3"><a class="nav-link" href="#%E9%85%8D%E7%BD%AEhadoop%E7%8E%AF%E5%A2%83%E5%8F%98%E9%87%8F"><span class="nav-text">配置Hadoop环境变量</span></a></li><li class="nav-item nav-level-3"><a class="nav-link" href="#%E9%85%8D%E7%BD%AE%E5%90%AF%E5%8A%A8hadoop"><span class="nav-text">配置启动Hadoop</span></a></li><li class="nav-item nav-level-3"><a class="nav-link" href="#%E9%85%8D%E7%BD%AEssh%E5%85%8D%E5%AF%86%E7%A0%81%E7%99%BB%E9%99%86%E5%AF%86%E7%A0%81%E4%BA%92%E9%80%9A"><span class="nav-text">配置SSH免密码登陆(密码互通)</span></a></li><li class="nav-item nav-level-3"><a class="nav-link" href="#hdfs%E5%90%AF%E5%8A%A8%E4%B8%8E%E5%81%9C%E6%AD%A2"><span class="nav-text">hdfs启动与停止</span></a></li><li class="nav-item nav-level-3"><a class="nav-link" href="#%E9%85%8D%E7%BD%AE%E5%92%8C%E5%90%AF%E5%8A%A8yarn"><span class="nav-text">配置和启动YARN</span></a></li></ol></li><li class="nav-item nav-level-2"><a class="nav-link" href="#%E6%B5%8B%E8%AF%95"><span class="nav-text">测试</span></a></li></ol></div>
<div class="post-toc motion-element"><ol class="nav"><li class="nav-item nav-level-2"><a class="nav-link" href="#%E6%96%87%E7%AB%A0%E5%BC%80%E5%A4%B4"><span class="nav-text">文章开头</span></a></li><li class="nav-item nav-level-2"><a class="nav-link" href="#%E5%AE%89%E8%A3%85%E4%B9%8B%E5%89%8D"><span class="nav-text">安装之前</span></a></li><li class="nav-item nav-level-2"><a class="nav-link" href="#%E8%A7%A3%E5%8E%8B%E7%BC%A9%E5%8F%8A%E9%85%8D%E7%BD%AE"><span class="nav-text">解压缩及配置</span></a><ol class="nav-child"><li class="nav-item nav-level-3"><a class="nav-link" href="#%E9%85%8D%E7%BD%AEjava%E7%8E%AF%E5%A2%83%E5%8F%98%E9%87%8F%E5%8F%AF%E4%B8%8E%E9%85%8D%E7%BD%AEhadoop%E7%8E%AF%E5%A2%83%E5%8F%98%E9%87%8F%E9%85%8D%E7%BD%AE%E4%B8%80%E8%B5%B7%E8%BF%9B%E8%A1%8C"><span class="nav-text">配置Java环境变量(可与配置Hadoop环境变量配置一起进行)</span></a></li><li class="nav-item nav-level-3"><a class="nav-link" href="#%E9%85%8D%E7%BD%AEhadoop%E7%8E%AF%E5%A2%83%E5%8F%98%E9%87%8F"><span class="nav-text">配置Hadoop环境变量</span></a></li><li class="nav-item nav-level-3"><a class="nav-link" href="#%E9%85%8D%E7%BD%AE%E5%90%AF%E5%8A%A8hadoop"><span class="nav-text">配置启动Hadoop</span></a></li><li class="nav-item nav-level-3"><a class="nav-link" href="#%E9%85%8D%E7%BD%AEssh%E5%85%8D%E5%AF%86%E7%A0%81%E7%99%BB%E9%99%86%E5%AF%86%E7%A0%81%E4%BA%92%E9%80%9A"><span class="nav-text">配置SSH免密码登陆(密码互通)</span></a></li><li class="nav-item nav-level-3"><a class="nav-link" href="#hdfs%E5%90%AF%E5%8A%A8%E4%B8%8E%E5%81%9C%E6%AD%A2"><span class="nav-text">hdfs启动与停止</span></a></li><li class="nav-item nav-level-3"><a class="nav-link" href="#%E9%85%8D%E7%BD%AE%E5%92%8C%E5%90%AF%E5%8A%A8yarn"><span class="nav-text">配置和启动YARN</span></a></li></ol></li><li class="nav-item nav-level-2"><a class="nav-link" href="#%E6%B5%8B%E8%AF%95"><span class="nav-text">测试</span></a></li><li class="nav-item nav-level-2"><a class="nav-link" href="#hdfs%E7%9A%84%E5%B8%B8%E7%94%A8%E6%93%8D%E4%BD%9C%E5%91%BD%E4%BB%A4"><span class="nav-text">HDFS的常用操作命令</span></a></li></ol></div>
</div>
<!--/noindex-->

4 changes: 2 additions & 2 deletions 2017/05/07/在Hadoop基础上Hive的安装/index.html
@@ -32,8 +32,8 @@
<meta property="article:modified_time" content="2023-12-05T01:50:48.963Z">
<meta property="article:author" content="wnma3mz">
<meta property="article:tag" content="Hadoop">
<meta property="article:tag" content="Hive">
<meta property="article:tag" content="大数据">
<meta property="article:tag" content="Hive">
<meta name="twitter:card" content="summary">
<meta name="twitter:image" content="https://raw.githubusercontent.com/wnma3mz/blog_posts/master/imgs/hadoop/20170612195815151.png">

@@ -369,8 +369,8 @@ <h3 id="转换类型">转换类型</h3>
<footer class="post-footer">
<div class="post-tags">
<a href="/hexo_blog/tags/Hadoop/" rel="tag"># Hadoop</a>
<a href="/hexo_blog/tags/Hive/" rel="tag"># Hive</a>
<a href="/hexo_blog/tags/%E5%A4%A7%E6%95%B0%E6%8D%AE/" rel="tag"># 大数据</a>
<a href="/hexo_blog/tags/Hive/" rel="tag"># Hive</a>
</div>


