// Load the Phoenix table backed by HBase and map each row to a (userName, gender) pair
val user = sqlContext.load(
  "org.apache.phoenix.spark",
  Map("table" -> "user", "zkUrl" -> "192.168.159.129:2181")
).rdd.map(x => {
  val userName = x.getAs[String]("userName")
  val gender = x.getAs[String]("gender")
  (userName, gender)
})
I am reading a table in HBase roughly like this, but the HBase table holds a very large amount of data. How should I handle that?
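For reference, one direction would be to stay in the DataFrame API so that phoenix-spark can prune columns and push simple predicates down to Phoenix, rather than converting to an RDD right away and scanning the whole table into Spark. The following is only a minimal sketch, assuming Spark 1.4+ (sqlContext.read) and a hypothetical filter on gender; the table name and zkUrl are taken from the snippet above:

// Read through the phoenix-spark data source as a DataFrame first
val df = sqlContext.read
  .format("org.apache.phoenix.spark")
  .options(Map("table" -> "user", "zkUrl" -> "192.168.159.129:2181"))
  .load()

val users = df
  .select("userName", "gender")      // column pruning: only fetch the columns actually needed
  .filter(df("gender") === "F")      // hypothetical predicate; pushed down where the connector supports it
  .rdd
  .map(r => (r.getAs[String]("userName"), r.getAs[String]("gender")))

Converting to an RDD only after the select/filter keeps the scan on the Phoenix side as narrow as possible; if the remaining data is still large, repartitioning afterwards may help spread the work across more executors.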