Big Data Persistence and Analytics Blog

How much memory is used by MGMTDB, the Grid Infrastructure Management Repository (GIMR) database?
Answer:
memory_max_target big integer 1536M
memory_target big integer 1536M
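The values above can be checked directly on the MGMTDB instance. A minimal sketch, assuming you are the Grid Infrastructure owner on the node currently hosting the repository (the instance name -MGMTDB is the 12c default and may differ in your setup):

```shell
# Point at the management repository instance (12c default SID is -MGMTDB)
export ORACLE_SID=-MGMTDB
sqlplus / as sysdba <<'EOF'
show parameter memory_target
show parameter memory_max_target
EOF
```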
The Problem:
After upgrading the application libraries (Spring 3.2 to 4.2, Mahout, HBase 0.98 to 1.1, iBATIS 2 to MyBatis, …) the application could no longer connect to HBase. After adding and excluding some of the missing and conflicting dependencies, it was still stuck.
The Debugging:
To narrow down the problem, I decided to use a simple test application.
Setup test table:
bin/hbase shell
hbase(main):001:0> create 'test', 'cf'
0 row(s) in 3.8890 seconds
hbase(main):002:0> put 'test', 'row1', 'cf:a', 'value1'
0 row(s) in 0.1840 seconds
Demo Application:
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseClient {

    public static void main(String[] arg) throws IOException {
        Configuration config = HBaseConfiguration.create();
        config.set("hbase.zookeeper.quorum", "127.0.0.1");
        config.set("hbase.zookeeper.property.clientPort", "2182");
        config.set("hbase.hconnection.threads.core", "1");
        config.set("hbase.hconnection.threads.max", "10");
        config.set("hbase.client.retries.number", "3");
        config.set("hbase.rootdir", "hdfs://127.0.0.1:9000/hbase");

        long startTime0 = System.currentTimeMillis();
        HTable testTable = new HTable(config, "test");
        long stopTime0 = System.currentTimeMillis();
        System.out.println("connect in " + (stopTime0 - startTime0) + " ms");

        long startTime = System.currentTimeMillis();
        for (int i = 0; i < 1; i++) {
            byte[] family = Bytes.toBytes("cf");
            byte[] qual = Bytes.toBytes("a");
            Scan scan = new Scan();
            scan.addColumn(family, qual);
            scan.setMaxResultsPerColumnFamily(5);
            ResultScanner scanner = testTable.getScanner(scan);
            for (Result r = scanner.next(); r != null; r = scanner.next()) {
                byte[] valueObj = r.getValue(family, qual);
                String value = new String(valueObj);
                System.out.println(value);
            }
            // release the scanner's server-side resources
            scanner.close();
        }
        long stopTime = System.currentTimeMillis();
        System.out.println("Scan finished in " + (stopTime - startTime) + " ms");

        long startTime2 = System.currentTimeMillis();
        Get g = new Get("row1".getBytes());
        Result rs = testTable.get(g);
        long stopTime2 = System.currentTimeMillis();
        System.out.println("Get finished in " + (stopTime2 - startTime2) + " ms");

        testTable.close();
    }
}
(Sorry about the unnecessary timing code; I'm a bit of a performance freak.)
Now I just added the libraries needed until it worked. Then I added more until it stopped working again.
The Solution:
Exception in thread "main" java.lang.NoSuchFieldError: IBM_JAVA
    at org.apache.hadoop.security.UserGroupInformation.getOSLoginModuleName(UserGroupInformation.java:339)
    at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:384)
hadoop-core-1.2.1.jar and hadoop-auth-2.5.1.jar are conflicting; remove hadoop-core from your classpath. Both jars ship a copy of org.apache.hadoop.util.PlatformName, and the old hadoop-core version lacks the IBM_JAVA field that the newer UserGroupInformation expects.
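When two jars ship the same class, it helps to see which copy the JVM actually loaded. A small generic diagnostic sketch (not part of the original post) that prints the jar a class came from; it defaults to java.lang.String so it runs without Hadoop on the classpath:

```java
// ClassOrigin.java -- prints which jar (or the JDK itself) a class was loaded from.
import java.security.CodeSource;

public class ClassOrigin {

    static String originOf(String className) throws ClassNotFoundException {
        Class<?> clazz = Class.forName(className);
        CodeSource source = clazz.getProtectionDomain().getCodeSource();
        // core JDK classes report no code source
        return source == null ? "JDK runtime" : source.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // On the broken classpath, pass org.apache.hadoop.util.PlatformName
        // to see whether hadoop-core-1.2.1.jar or hadoop-auth-2.5.1.jar wins.
        String target = args.length > 0 ? args[0] : "java.lang.String";
        System.out.println(target + " -> " + originOf(target));
    }
}
```

Running it with org.apache.hadoop.util.PlatformName as the argument on the failing classpath shows immediately which jar shadows the other.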
Example to exclude it in Maven from mahout:
<dependency>
    <groupId>org.apache.mahout</groupId>
    <artifactId>mahout</artifactId>
    <version>0.8</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-core</artifactId>
        </exclusion>
    </exclusions>
</dependency>
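To verify the exclusion took effect, the project's dependency tree can be filtered for Hadoop artifacts with the standard maven-dependency-plugin goal:

```shell
# List all org.apache.hadoop artifacts on the effective classpath;
# hadoop-core should no longer appear after the exclusion.
mvn dependency:tree -Dincludes=org.apache.hadoop
```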