tag:blogger.com,1999:blog-64604474902503556812024-02-20T01:14:42.233-08:00Blog::: JvmNotFoundExceptionJava,Hadoop,Spark,NoSQLRaj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.comBlogger155125tag:blogger.com,1999:blog-6460447490250355681.post-30347052375498297442019-12-28T17:58:00.000-08:002019-12-28T17:58:02.527-08:00Simple golang webapp on docker for quick test<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/fc1ba2af3cfbf61eb0edec35994c1735.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-7749631774194787572019-08-20T18:08:00.000-07:002019-08-20T18:08:18.294-07:00Apache Hive Client (implemented using GoLang) to connect to hiveserver2<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/ecde1281f7afc1c8566bca5a4c14d20f.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-88216135687482486042019-07-25T12:16:00.000-07:002019-07-25T12:16:07.113-07:00Debugging HiveServer2 Docker container Remotely using Intellij<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/e2a5d02a3a895c7f7e8f1d6d0e71e0e9.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com1tag:blogger.com,1999:blog-6460447490250355681.post-23884601640345107092019-07-25T11:36:00.000-07:002019-07-25T11:36:03.612-07:00Collect Jstack,Jmap and other JVM diagnostics for a JVM running inside the docker container<div dir="ltr" style="text-align: left;" trbidi="on">Run HiveServer2 in a docker container (please follow the link http://rajkrrsingh.blogspot.com/2019/07/running-hiveserver2-on-docker.html)<br />
<br />
Get the container id or container name of the running container:<br />
<br />
<code><br />
docker ps -a<br />
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES<br />
0749ae15ece8 hive3-image "hive --service hive…" 20 hours ago Up 20 hours 0.0.0.0:10000->10000/tcp jovial_swartz<br />
<br />
<br />
Now you can use the container id or the container name to connect to the container interactively:<br />
<br />
docker exec -it 0749ae15ece8 /bin/bash <br />
[hive@0749ae15ece8 ~]$ pwd<br />
/home/hive<br />
[hive@0749ae15ece8 ~]$ jps<br />
1 RunJar<br />
810 Jps<br />
[hive@0749ae15ece8 ~]$ ps aux | grep hiveserver<br />
hive 1 1.7 21.0 2170608 431380 pts/0 Ssl+ 11:47 6:57 /usr/bin/java -Dproc_jar -Dproc_hiveserver2 -Dlog4j.configurationFile=hive-log4j2.properties -Djava.util.logging.config.file=/grid/apache-hive-3.1.1-bin/conf/parquet-logging.properties -Dyarn.log.dir=/grid/hadoop-3.1.1/logs -Dyarn.log.file=hadoop.log -Dyarn.home.dir=/grid/hadoop-3.1.1 -Dyarn.root.logger=INFO,console -Djava.library.path=/grid/hadoop-3.1.1/lib/native -Xmx256m -Dhadoop.log.dir=/grid/hadoop-3.1.1/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/grid/hadoop-3.1.1 -Dhadoop.id.str=hive -Dhadoop.root.logger=INFO,console -Dhadoop.policy.file=hadoop-policy.xml -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /grid/apache-hive-3.1.1-bin/lib/hive-service-3.1.1.jar org.apache.hive.service.server.HiveServer2 --hiveconf datanucleus.schema.autoCreateAll=true --hiveconf hive.metastore.schema.verification=false<br />
hive 828 0.0 0.0 9096 876 pts/1 S+ 18:25 0:00 grep --color=auto hiveserver<br />
[hive@0749ae15ece8 ~]$ jstack -l 1<br />
......<br />
<br />
<br />
jmap -histo:live 1<br />
<br />
....<br />
<br />
</code><br />
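If the container image does not ship the JDK tools, a thread dump can also be produced from inside the JVM itself via the standard <code>java.lang.management</code> API. The sketch below (the class name <code>ThreadDumpDemo</code> is made up for this example, not something from the post above) prints a jstack-like listing of thread names, states, and stack frames:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

public class ThreadDumpDemo {
    // Build a jstack-like dump of all live threads using the JMX thread bean.
    static String threadDump() {
        ThreadMXBean bean = ManagementFactory.getThreadMXBean();
        StringBuilder sb = new StringBuilder();
        // dumpAllThreads(lockedMonitors=false, lockedSynchronizers=false)
        for (ThreadInfo info : bean.dumpAllThreads(false, false)) {
            sb.append('"').append(info.getThreadName()).append("\" ")
              .append(info.getThreadState()).append('\n');
            for (StackTraceElement frame : info.getStackTrace()) {
                sb.append("\tat ").append(frame).append('\n');
            }
            sb.append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(threadDump());
    }
}
```

This avoids `docker exec` entirely when the dump is triggered from application code, e.g. from a diagnostics HTTP endpoint.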
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com1tag:blogger.com,1999:blog-6460447490250355681.post-6260527137745633162019-07-24T16:11:00.002-07:002019-07-24T16:12:37.203-07:00Running Hiveserver2 on docker<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/626a4e57af9422abd75a7b48c1118d6d.js"></script></div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-35023714474947015762019-03-25T11:04:00.002-07:002019-03-25T11:04:24.950-07:00Hive Kafka Integration <div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/c021153cb2ee3637e29b7d7e6e36a173.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-37120920512121612142018-07-02T14:08:00.000-07:002018-07-02T14:08:13.910-07:00Quickstart Druid Kafka Indexing Service<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/6960be715a391f3399da417e7e2741b3.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-36868865558270122722018-06-30T22:59:00.000-07:002018-06-30T22:59:45.403-07:00quick-start guide to ingest data into druid using batch mode on HDP platform.<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/b70dd81511fd5ac9d9255579fd079149.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-7677208295109493632018-06-30T20:13:00.000-07:002018-06-30T20:13:03.107-07:00Hive druid Integration : quick test to create druid table from hive table<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/f01475f4bfa4a33240134561171f378f.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-53640041681232763752018-06-27T16:57:00.002-07:002018-06-27T16:57:49.790-07:00Hive Compaction failing with FileAlreadyExistsException<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/94a23b9468742c84f045d7fce4428469.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com2tag:blogger.com,1999:blog-6460447490250355681.post-27685777101822912072018-01-17T14:22:00.002-08:002018-01-17T14:22:40.490-08:00how to create own metastore event listner<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/4016f0a3f02c7f493171d29fb403b62b.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-29562092458868462522017-10-26T02:39:00.000-07:002017-10-26T02:39:08.052-07:00Apache Druid Installation Issue on HDP<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/9b109f00d5f9e815c69b10df8433a926.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-37669502053618030252017-10-06T09:58:00.003-07:002017-10-06T09:58:54.298-07:00Monitoring Kafka Broker JMX using Jolokia JVM Agent <div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/b4378a726d56f43dc5786dc3b73b4918.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-39437938902618692162017-09-05T02:16:00.000-07:002017-09-05T02:16:30.425-07:00Kafka Mirror Maker - from source non-kerberized cluster to kerberized cluster<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/56651ecf2751b0f65208a96f46fa909a.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-72149147255600862102017-08-02T00:18:00.000-07:002017-08-02T00:18:06.040-07:00Creating Custom UDF with LLAP<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/a49b5e79beaae70cf4685d8aaa8efd1e.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-1014689434574497022017-06-18T08:07:00.000-07:002017-06-18T08:07:22.195-07:00Spark Kafka Integration in Kerberized Enviorment<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/0809f155c72194fd69cf1503b68589ac.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-60248689426926273682017-05-27T08:53:00.000-07:002017-05-27T08:53:27.458-07:00Spark LLAP Setup for Spark Thrift Server<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/b37a9d2ae1218181fe1f1a9cfe43bbd7.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-72602138355428297552017-05-21T07:24:00.000-07:002017-05-21T07:24:48.770-07:00Steps to setup kdc before installing kerberos through ambari on hortonworks cluster Raw<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/d0797291b628991d3a47a9ac517a77e6.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-69981723376673434122017-05-18T05:10:00.000-07:002017-05-18T05:11:05.214-07:00How to Configure and Run Storm AutoHDFS plugin (sample Application)<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/aeb4a4f10aaf2d6a6ca9a575c1f9572f.js"></script></div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-81834725752209059472017-05-08T01:40:00.004-07:002017-05-08T01:40:30.177-07:00oozie spark shell action example<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/979c96b00aa3db948db3015a509a449e.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-87535842996411771562017-05-07T23:49:00.000-07:002017-05-07T23:49:23.782-07:00oozie spark action example<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/71f43afaac098428dc614d50ca0293ac.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-25970548956053473112016-11-27T02:50:00.000-08:002016-11-27T02:50:15.523-08:00configure KNOX over HiveServer2 HA on HDP-2.5<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/e79c615d23562b606559ec65e5651e77.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-38987336820494778432016-07-17T23:06:00.000-07:002016-07-17T23:06:15.922-07:00A quick guide to connect HiveServer2 to MySQL DB metastore over SSL<div dir="ltr" style="text-align: left;" trbidi="on"><script src="https://gist.github.com/rajkrrsingh/bfca4e6582fb0ea3fab55992745a74fe.js"></script><br />
</div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-67216299931451507682016-03-20T05:40:00.000-07:002016-03-20T05:40:08.295-07:00<div dir="ltr" style="text-align: left;" trbidi="on">Apache Flink is a streaming dataflow engine that provides data distribution, communication, and fault tolerance for distributed computations over data streams. This is a sample application that consumes the output of the vmstat command as a stream, so let's get our hands dirty.<br />
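Flink's <code>socketTextStream</code> source used in this example simply reads newline-delimited text from a TCP socket, which is why <code>vmstat 1 | nc -l 9000</code> works as a feed. As a rough illustration of that behavior in plain Java (no Flink dependency; the class name <code>SocketLineSource</code>, the sample lines, and the in-process server are made up for this sketch):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.ArrayList;
import java.util.List;

public class SocketLineSource {
    // Connect to host:port and collect up to maxLines newline-delimited records,
    // roughly what Flink's socket source does before emitting each line downstream.
    static List<String> readLines(String host, int port, int maxLines) throws Exception {
        List<String> lines = new ArrayList<>();
        try (Socket socket = new Socket(host, port);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(socket.getInputStream()))) {
            String line;
            while (lines.size() < maxLines && (line = in.readLine()) != null) {
                lines.add(line);
            }
        }
        return lines;
    }

    // Tiny in-process stand-in for "vmstat 1 | nc -l 9000": a server that
    // writes a couple of lines so the reader above has something to consume.
    static List<String> demo() throws Exception {
        try (ServerSocket server = new ServerSocket(0)) {
            Thread writer = new Thread(() -> {
                try (Socket client = server.accept();
                     PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
                    out.println("procs -----------memory----------");
                    out.println(" r  b   swpd   free   buff  cache");
                } catch (Exception ignored) {
                }
            });
            writer.start();
            List<String> lines = readLines("localhost", server.getLocalPort(), 2);
            writer.join();
            return lines;
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(demo());
    }
}
```

In the real job below, each such line becomes one element of the stream, which the sink then prints.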
<pre class="brush:bash">mkdir flink-steaming-example
cd flink-steaming-example/
mkdir -p src/main/scala
cd src/main/scala/
vim FlinkStreamingExample.scala
import org.apache.flink.streaming.api.scala._
object FlinkStreamingExample {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val socketVmStatStream = env.socketTextStream("ip-10-0-0-233", 9000)
    socketVmStatStream.print
    env.execute()
  }
}
cd -
vim build.sbt
name := "flink-streaming-examples"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies ++= Seq("org.apache.flink" %% "flink-scala" % "1.0.0",
"org.apache.flink" %% "flink-clients" % "1.0.0",
"org.apache.flink" %% "flink-streaming-scala" % "1.0.0")
sbt clean package
# stream some test data on port 9000
vmstat 1 | nc -l 9000
# now submit the flink job
bin/flink run /root/flink-steaming-example/target/scala-2.10/flink-streaming-examples_2.10-1.0.jar
03/20/2016 08:04:12 Job execution switched to status RUNNING.
03/20/2016 08:04:12 Source: Socket Stream -> Sink: Unnamed(1/1) switched to SCHEDULED
03/20/2016 08:04:12 Source: Socket Stream -> Sink: Unnamed(1/1) switched to DEPLOYING
03/20/2016 08:04:12 Source: Socket Stream -> Sink: Unnamed(1/1) switched to RUNNING
# let's see the output of the job -- the vmstat output shows up here
tail -f flink-1.0.0/log/flink-root-jobmanager-0-ip-10-0-0-233.out
</pre></div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0tag:blogger.com,1999:blog-6460447490250355681.post-72983342516631739542016-02-25T05:25:00.000-08:002016-02-25T05:25:33.660-08:00Hadoop MapReduce : Enabling JVM Profiling using -XPROF<div dir="ltr" style="text-align: left;" trbidi="on">The -Xprof profiler is the HotSpot profiler. HotSpot works by running Java code in interpreted mode, while running a profiler in parallel. The HotSpot profiler looks for "hot spots" in the code, i.e. methods that the JVM spends a significant amount of time running, and then compiles those methods into native generated code.<br />
-Xprof is very handy for profiling MapReduce code; here are the few configuration parameters required to turn on profiling in MapReduce using -Xprof:<br />
<pre class="brush:bash">mapreduce.task.profile='true'
mapreduce.task.profile.maps='0-'
mapreduce.task.profile.reduces='0-'
mapreduce.task.profile.params='-Xprof'
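
# The same properties can also be passed per job on the command line
# (jar, driver class, and paths below are placeholders; this assumes the
# driver uses ToolRunner so GenericOptionsParser picks up the -D options):
hadoop jar my-job.jar MyDriver \
  -Dmapreduce.task.profile=true \
  -Dmapreduce.task.profile.maps=0-2 \
  -Dmapreduce.task.profile.reduces=0-2 \
  -Dmapreduce.task.profile.params=-Xprof \
  input output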
</pre></div>Raj Kumar Singhhttp://www.blogger.com/profile/10739618929156312164noreply@blogger.com0