Error: Error while compiling statement: [Error 10308]: Attempt to acquire compile lock timed out. (state=,code=10308)
org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: [Error 10308]: Attempt to acquire compile lock timed out.
    at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:241)
    at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:227)
    at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:255)
    at org.apache.hive.beeline.Commands.executeInternal(Commands.java:989)
    at org.apache.hive.beeline.Commands.execute(Commands.java:1180)
    at org.apache.hive.beeline.Commands.sql(Commands.java:1094)
    at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1180)
    at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:1013)
    at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:922)
    at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:518)
    at org.apache.hive.beeline.BeeLine.main(BeeLine.java:501)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:226)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:141)
Caused by: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: [Error 10308]: Attempt to acquire compile lock timed out.
    at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:400)
    at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:187)
    at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:271)
    at org.apache.hive.service.cli.operation.Operation.run(Operation.java:337)
    at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:439)
    at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:416)
    at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:282)
    at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:503)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec dir="/Volumes/Extended/Users/msh/IdealProjects/Git-Study/hadoop/hadoop-tools/hadoop-pipes/target/native" executable="cmake" failonerror="true">... @ 5:153 in /Volumes/Extended/Users/msh/IdealProjects/Git-Study/hadoop/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml
My guess was an Ant version problem, so I reinstalled an Ant build that works with JDK 1.7.
The Ant BuildException is rather misleading. My machine had originally been set up with JDK 1.8, and after switching to 1.7 Ant stopped working altogether (the Ant installed via brew was compiled against JDK 1.8, so a 1.7 JVM cannot read its class files). After reinstalling an Ant version built for 1.7, Ant itself ran fine again, yet the build still failed with this same error...
[ERROR] Failed to execute goal on project hadoop-aws: Could not resolve dependencies for project org.apache.hadoop:hadoop-aws:jar:2.6.0: Could not transfer artifact com.amazonaws:aws-java-sdk:jar:1.7.4 from/to central (https://repo.maven.apache.org/maven2): GET request of: com/amazonaws/aws-java-sdk/1.7.4/aws-java-sdk-1.7.4.jar from central failed: SSL peer shut down incorrectly -> [Help 1]
Messages such as "Could not resolve dependencies" or "SSL peer shut down incorrectly" usually just mean the connection to the Maven repository is flaky: switch to a more stable Maven mirror, or simply rerun the build a few times.
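For illustration, one common way to switch the source is a <mirror> entry in ~/.m2/settings.xml; the Aliyun repository used here is only an example of an alternative mirror (an assumption on my part), substitute whichever repository is stable from your network:

<settings>
  <mirrors>
    <mirror>
      <id>aliyun-public</id>
      <!-- redirect only requests that would otherwise go to Maven Central -->
      <mirrorOf>central</mirrorOf>
      <name>Aliyun public repository (example alternative source)</name>
      <url>https://maven.aliyun.com/repository/public</url>
    </mirror>
  </mirrors>
</settings>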
$ mv /usr/bin/openssl /usr/bin/openssl_old
mv: rename /usr/bin/openssl to /usr/bin/openssl_old: Operation not permitted
$ ln -s /usr/local/Cellar/openssl/1.0.2p/bin/openssl /usr/bin/openssl
ln: /usr/bin/openssl: Operation not permitted
"Operation not permitted" means the operation is not allowed even with sudo; I had already run into this a few times when touching anything under /usr/bin, so I kept googling and found the Stack Overflow question "Operation Not Permitted when on root El capitan (rootless disabled)".
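For reference, a minimal sketch of the two usual ways around this, reusing the paths from the commands above (that /usr/local/bin precedes /usr/bin on the default macOS PATH is an assumption worth verifying on your machine):

# Symlink into /usr/local/bin instead, which SIP does not protect
$ ln -s /usr/local/Cellar/openssl/1.0.2p/bin/openssl /usr/local/bin/openssl
# Or check whether System Integrity Protection ("rootless") is what blocks /usr/bin;
# turning it off means booting into Recovery and running `csrutil disable` there
$ csrutil status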
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-auth: There are test failures.
[ERROR]
[ERROR] Please refer to /Volumes/Extended/Users/msh/IdealProjects/Git-Study/hadoop/hadoop-common-project/hadoop-auth/target/surefire-reports for the individual test results.

Results :

Tests in error:
  TestKerberosAuthenticator.testAuthenticationHttpClientPost:157 » ClientProtocol
  TestKerberosAuthenticator.testAuthenticationHttpClientPost:157 » ClientProtocol

Tests run: 92, Failures: 0, Errors: 2, Skipped: 0
/** POJO for a user behavior record */
public static class UserBehavior {
    public long userId;      // user ID
    public long itemId;      // item ID
    public int categoryId;   // item category ID
    public String behavior;  // behavior type: "pv", "buy", "cart", "fav"
    public long timestamp;   // timestamp of the behavior, in seconds
}
We use .keyBy("itemId") to group the stream by item, then .timeWindow(Time size, Time slide) to build a sliding window per item (a 1-hour window that slides every 5 minutes). On that window we call .aggregate(AggregateFunction af, WindowFunction wf) to aggregate incrementally: the AggregateFunction pre-aggregates records as they arrive, which greatly reduces the state that has to be kept. This is much more efficient than .apply(WindowFunction wf), which buffers every record in the window and only computes when the window fires. The first parameter of aggregate() performs the incremental aggregation; the second turns each aggregated value, together with the window metadata, into the output record (see the sketch after the ItemViewCount class below).
/** Item view count (output type of the window operation) */
public static class ItemViewCount {
    public long itemId;     // item ID
    public long windowEnd;  // end timestamp of the window
    public long viewCount;  // number of views of the item

    public static ItemViewCount of(long itemId, long windowEnd, long viewCount) {
        ItemViewCount result = new ItemViewCount();
        result.itemId = itemId;
        result.windowEnd = windowEnd;
        result.viewCount = viewCount;
        return result;
    }
}
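To make the wiring concrete, here is a minimal sketch of the two functions handed to aggregate() and of the resulting pipeline. The names CountAgg, WindowResultFunction and the input stream pvStream are illustrative assumptions; the snippet presumes a DataStream<UserBehavior> that has already been filtered to "pv" events and had event-time timestamps and watermarks assigned.

import org.apache.flink.api.common.functions.AggregateFunction;
import org.apache.flink.api.java.tuple.Tuple;
import org.apache.flink.api.java.tuple.Tuple1;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.functions.windowing.WindowFunction;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
import org.apache.flink.util.Collector;

/** Incremental aggregation: counts the records (views) of one item within the window. */
public static class CountAgg implements AggregateFunction<UserBehavior, Long, Long> {
    @Override
    public Long createAccumulator() { return 0L; }

    @Override
    public Long add(UserBehavior value, Long acc) { return acc + 1; }

    @Override
    public Long getResult(Long acc) { return acc; }

    @Override
    public Long merge(Long a, Long b) { return a + b; }
}

/** Wraps the pre-aggregated count and the window metadata into an ItemViewCount. */
public static class WindowResultFunction
        implements WindowFunction<Long, ItemViewCount, Tuple, TimeWindow> {
    @Override
    public void apply(Tuple key, TimeWindow window, Iterable<Long> counts,
                      Collector<ItemViewCount> out) {
        long itemId = ((Tuple1<Long>) key).f0;   // keyBy("itemId") keys by a Tuple1
        long count = counts.iterator().next();   // single value produced by CountAgg
        out.collect(ItemViewCount.of(itemId, window.getEnd(), count));
    }
}

// Inside the job's main(), assuming pvStream is the prepared DataStream<UserBehavior>:
DataStream<ItemViewCount> windowedData = pvStream
        .keyBy("itemId")
        .timeWindow(Time.hours(1), Time.minutes(5))
        .aggregate(new CountAgg(), new WindowResultFunction());

Because CountAgg reduces each window's records to a single Long as they arrive, the window state holds one counter per item instead of every UserBehavior record, which is exactly the saving described above.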
An ASF Member is roughly a "shareholder" of the foundation: Members vote in board elections and may themselves stand for the board. Members also have a say in whether a new project is accepted, and their focus is mainly on the development of the Apache Foundation itself. One usually starts out as a Contributor or Committer and is accepted as an ASF Member only after demonstrating sustained commitment through concrete work.