I'm following the instructions in the README and am stuck on Step 8.
When I run `df = spark.table("demo.rpc.pizzas")` I get the following error:
>>> df = spark.table("demo.rpc.pizzas")
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/opt/spark/python/pyspark/sql/session.py", line 1667, in table
return DataFrame(self._jsparkSession.table(tableName), self)
File "/opt/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1322, in __call__
File "/opt/spark/python/pyspark/errors/exceptions/captured.py", line 185, in deco
raise converted from None
pyspark.errors.exceptions.captured.AnalysisException: [TABLE_OR_VIEW_NOT_FOUND] The table or view `demo`.`rpc`.`pizzas` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS.;
'UnresolvedRelation [demo, rpc, pizzas], [], false
Is there a step missing from the demo that was supposed to create the catalog and the table?
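In case it helps triage, this is roughly the setup step I expected to find before Step 8 — a minimal sketch, not taken from the demo: it assumes the `demo` catalog is already configured on the Spark session, and the columns (`id`, `name`) are placeholders since I don't know the table's real schema.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Create the namespace (schema) inside the configured `demo` catalog.
spark.sql("CREATE NAMESPACE IF NOT EXISTS demo.rpc")

# Create the table with placeholder columns so spark.table() can resolve it.
spark.sql("""
    CREATE TABLE IF NOT EXISTS demo.rpc.pizzas (
        id INT,
        name STRING
    )
""")

# This is the call from Step 8 that currently fails for me.
df = spark.table("demo.rpc.pizzas")
df.show()
```

If something like this is supposed to happen in an earlier step (or in a setup script), could you point me to it?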