java.nio.channels.SocketChannel[connection-pending remote=/xx.xx.xx.xx:9866]

2023-10-10 12:52

This article describes the error java.nio.channels.SocketChannel[connection-pending remote=/xx.xx.xx.xx:9866] and how to resolve it, in the hope that it offers some reference value to developers troubleshooting the same problem.

Contents

Background

Problem description

Solution


Background

A CDH cluster is deployed on an internal network, and clients on the external network need to submit jobs to the cluster's Yarn. Since the external clients cannot reach the internal network directly, each host in the internal network is bound to a floating IP, and connectivity is opened between the external clients and those floating IPs.

Problem description

The external client submits jobs to the internal cluster through the floating IPs. After a job is submitted to Yarn, the cluster returns a response to the client, but the node addresses in that response are all internal IPs, which the client cannot reach. The exact error is as follows:

[INFO] 2023-09-20 16:44:50.515  - [taskAppId=TASK-12637-0-7787]:[138] -  -> 2023-09-20 16:44:49,952 INFO  org.apache.hadoop.hdfs.DataStreamer [] - Exception in createBlockOutputStream
org.apache.hadoop.net.ConnectTimeoutException: 60000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=/172.17.0.8:9866]
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:534) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.hdfs.DataStreamer.createSocketForPipeline(DataStreamer.java:259) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1692) [flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1648) [flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:704) [flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
2023-09-20 16:44:49,964 WARN  org.apache.hadoop.hdfs.DataStreamer [] - Abandoning BP-1309512692-172.17.0.6-1691719706686:blk_1073803089_62280
2023-09-20 16:44:49,980 WARN  org.apache.hadoop.hdfs.DataStreamer [] - Excluding datanode DatanodeInfoWithStorage[172.17.0.8:9866,DS-961a5b2e-c2a1-46a3-bfdd-3910d2570bb3,DISK]
[INFO] 2023-09-20 16:45:50.524  - [taskAppId=TASK-12637-0-7787]:[138] -  -> 2023-09-20 16:45:50,043 INFO  org.apache.hadoop.hdfs.DataStreamer [] - Exception in createBlockOutputStream
org.apache.hadoop.net.ConnectTimeoutException: 60000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=/172.17.0.6:9866]
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:534) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.hdfs.DataStreamer.createSocketForPipeline(DataStreamer.java:259) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1692) [flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1648) [flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:704) [flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
2023-09-20 16:45:50,044 WARN  org.apache.hadoop.hdfs.DataStreamer [] - Abandoning BP-1309512692-172.17.0.6-1691719706686:blk_1073803091_62282
2023-09-20 16:45:50,053 WARN  org.apache.hadoop.hdfs.DataStreamer [] - Excluding datanode DatanodeInfoWithStorage[172.17.0.6:9866,DS-3a03d2ae-c218-44f6-80b6-253cb6ada508,DISK]
[INFO] 2023-09-20 16:46:50.415  - [taskAppId=TASK-12637-0-7787]:[127] - shell exit status code:1
[ERROR] 2023-09-20 16:46:50.415  - [taskAppId=TASK-12637-0-7787]:[137] - process has failure , exitStatusCode : 1 , ready to kill ...
[INFO] 2023-09-20 16:46:50.534  - [taskAppId=TASK-12637-0-7787]:[138] -  -> 2023-09-20 16:46:50,083 INFO  org.apache.hadoop.hdfs.DataStreamer [] - Exception in createBlockOutputStream
org.apache.hadoop.net.ConnectTimeoutException: 60000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=/172.17.0.4:9866]
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:534) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.hdfs.DataStreamer.createSocketForPipeline(DataStreamer.java:259) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1692) [flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1648) [flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:704) [flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
2023-09-20 16:46:50,084 WARN  org.apache.hadoop.hdfs.DataStreamer [] - Abandoning BP-1309512692-172.17.0.6-1691719706686:blk_1073803093_62284
2023-09-20 16:46:50,091 WARN  org.apache.hadoop.hdfs.DataStreamer [] - Excluding datanode DatanodeInfoWithStorage[172.17.0.4:9866,DS-5363866a-d143-42f7-85bb-a8236e0bbc41,DISK]
2023-09-20 16:46:50,105 WARN  org.apache.hadoop.hdfs.DataStreamer [] - DataStreamer Exception
org.apache.hadoop.ipc.RemoteException: File /user/hdfs/.flink/application_1691720545069_0007/chunjun/bin/chunjun-docker.sh could only be written to 0 of the 1 minReplication nodes. There are 3 datanode(s) running and 3 node(s) are excluded in this operation.
    at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:2102)
    at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:294)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2673)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:872)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:550)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1489) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.ipc.Client.call(Client.java:1435) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.ipc.Client.call(Client.java:1345) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at com.sun.proxy.$Proxy30.addBlock(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:444) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_211]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_211]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_211]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_211]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:409) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:346) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at com.sun.proxy.$Proxy31.addBlock(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.DataStreamer.locateFollowingBlock(DataStreamer.java:1838) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1638) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
    at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:704) [flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
2023-09-20 16:46:50,112 ERROR org.apache.flink.yarn.cli.FlinkYarnSessionCli [] - Error while running the Flink session.

Solution

  • Approach 1

Configure host mappings on the client, mapping each internal IP to its floating IP. This was tried and does not work: a hosts-file entry maps a hostname to an IP address, so it cannot rewrite one IP into another, and the client keeps connecting to the internal addresses.

  • Approach 2

Modify the HDFS client configuration. Note that the property name is dfs.client.use.datanode.hostname (not dfs.clientuse.datanode.hostname); setting it to true makes the HDFS client connect to DataNodes by hostname instead of by the internal IP address reported by the NameNode:

  <property>
    <name>dfs.client.use.datanode.hostname</name>
    <value>true</value>
  </property>
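Setting the property only tells the client to use hostnames; the client must also be able to resolve each DataNode hostname to its floating IP, for example through its hosts file. A minimal sketch of what those client-side entries might look like (the hostnames and floating IPs below are illustrative placeholders, not values from this cluster):

```
# Client /etc/hosts: resolve DataNode hostnames to their floating IPs,
# so that dfs.client.use.datanode.hostname=true reaches the reachable addresses.
203.0.113.11  datanode01
203.0.113.12  datanode02
203.0.113.13  datanode03
```

With this in place, the NameNode still reports internal addresses, but the client looks up the DataNode hostnames locally and connects through the floating IPs instead.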

That concludes this article on java.nio.channels.SocketChannel[connection-pending remote=/xx.xx.xx.xx:9866]; hopefully it is of some help to developers facing the same issue.



http://www.chinasem.cn/article/180575
