The authorization framework in Apache Hive 1.0.0, 1.0.1, 1.1.0, 1.1.1, 1.2.0 and 1.2.1, on clusters protected by Ranger and SqlStdHiveAuthorization, allows attackers to bypass intended parent table access restrictions via unspecified partition-level operations.
Apache Hive before 0.13.1, when in SQL standards based authorization mode, does not properly check the file permissions for (1) import and (2) export statements, which allows remote authenticated users to obtain sensitive information via a crafted URI.
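The statements in question are Hive's IMPORT and EXPORT commands, which read from and write to a filesystem URI supplied by the user; the flaw was that the file permissions on that URI were not properly checked. A minimal, hypothetical illustration of the affected statement shapes (table names and paths are placeholders):

    -- Hypothetical table names and paths, shown only to illustrate the affected statement shapes.
    EXPORT TABLE sales PARTITION (ds='2014-01-01') TO 'hdfs:///tmp/exports/sales';
    IMPORT TABLE sales_copy FROM 'hdfs:///tmp/exports/sales';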
The Hive EXPLAIN operation does not check for the necessary authorization of the entities involved in a query. An unauthorized user can run EXPLAIN on an arbitrary table or view and expose table metadata and statistics.
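As an illustration, an unauthorized user could submit an EXPLAIN of the following shape (the table name is a placeholder for one the user has no SELECT privilege on) and read metadata and statistics from the returned plan:

    -- Hypothetical table name; the plan output leaks metadata and statistics without a SELECT check.
    EXPLAIN SELECT * FROM restricted_table;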
In Apache Hive, local resources on HiveServer2 machines are not properly protected against a malicious user if Ranger, Sentry, or the SQL standard authorizer is not in use.
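For reference, "in use" here means one of these authorizers is actually enabled on HiveServer2. A sketch of the hive-site.xml properties that enable the SQL standard authorizer, shown as plain property = value pairs with the values documented for HiveServer2:

    # Sketch of hive-site.xml settings for SQL standard authorization on HiveServer2.
    hive.security.authorization.enabled = true
    hive.security.authorization.manager = org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizerFactory
    hive.security.authenticator.manager = org.apache.hadoop.hive.ql.security.SessionStateUserAuthenticator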
In Apache Hive, a malicious user might use any of the xpath UDFs to expose the content of a file on the machine running HiveServer2 that is owned by the HiveServer2 user (usually hive), if hive.server2.enable.doAs=false.
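The UDFs in question are Hive's built-in xpath family (xpath, xpath_string, xpath_boolean, xpath_int, xpath_long, xpath_float, xpath_double, xpath_number, xpath_short), which evaluate an XPath expression against an XML string argument. A benign illustration of their shape only, not an exploit payload:

    -- Benign illustration of the xpath UDF signature; returns 'bb'.
    SELECT xpath_string('<a><b>bb</b></a>', 'a/b');
    -- With hive.server2.enable.doAs=false, the query executes as the HiveServer2 service user
    -- (usually hive), which is why a crafted invocation can expose files owned by that user.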
In Apache Hive, when a COPY FROM FTP statement is run using the HPL/SQL extension to Hive, a compromised or malicious FTP server can cause the file to be written to an arbitrary location on the cluster where the command is run. This is because the FTP client code in HPL/SQL does not verify the destination location of the downloaded file. This does not affect the hive cli user and hiveserver2 user, as hplsql is …
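A hedged sketch of the HPL/SQL statement shape only; the clause names follow the HPL/SQL reference as best recalled and should be verified against its documentation, and the host, credentials, and paths are placeholders:

    -- Sketch only, not a verified working example; host, credentials, and paths are placeholders.
    COPY FROM FTP ftp.example.com USER 'ftpuser' PWD '***'
      DIR data/incoming FILES '.*\.csv' TO /user/hive/staging;
    -- The flaw: the HPL/SQL FTP client does not verify the destination location of the downloaded
    -- file, so a malicious server can cause it to be written to an arbitrary location.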