Nikhil Kumar
- Total activity 34
- Following 0 users
- Followed by 0 users
- Votes 3
- Subscriptions 12
Activity overview
Latest activity by Nikhil Kumar
Nikhil Kumar commented,
Hi Brian - GROUPCONCAT takes an additional argument which is the column you want to use to order the concatenation. The following help page actually uses a timestamp to order the concat using the s...
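The comment above describes a GROUPCONCAT that accepts an extra argument naming the column used to order the values before they are concatenated (for example, a timestamp). A minimal Python sketch of that behavior, with hypothetical column names ("user", "ts", "event") chosen for illustration:

```python
from collections import defaultdict

def group_concat(rows, group_col, value_col, order_col, sep=","):
    """Concatenate value_col per group_col, ordered by order_col."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[group_col]].append(row)
    return {
        key: sep.join(r[value_col] for r in sorted(vals, key=lambda r: r[order_col]))
        for key, vals in groups.items()
    }

# Events arrive out of order; the timestamp column restores the order.
rows = [
    {"user": "a", "ts": 3, "event": "logout"},
    {"user": "a", "ts": 1, "event": "login"},
    {"user": "a", "ts": 2, "event": "click"},
]
print(group_concat(rows, "user", "event", "ts"))
# {'a': 'login,click,logout'}
```

This mirrors the idea only; the actual GROUPCONCAT signature is documented on the help page the comment refers to.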
-
Nikhil Kumar created a post,
Get workbook metadata for multiple workbooks owned by a particular user.
I have a customer who is running version 6.1.25. They are looking to export metadata from a workbook to a metadata management tool. I gave them our workbook REST API call to export workbook info to...
-
Nikhil Kumar commented,
Hi Joel - Here is the job trace: 2017-01-12 04:00:01,163 INFO hs.JobHistoryServer (LogAdapter.java:info(45)) - STARTUP_MSG: /************************************************************ STARTUP...
-
Nikhil Kumar commented,
Hi Joel - I have requested Amir for the Job Trace. Here is the conductor.log he sent me. [system] INFO [2017-01-13 03:24:41.308] [MrPlanRunnerV2] (Logging.scala:54) - Source and destination file...
-
Nikhil Kumar created a post,
Download Data with different quote character
Hi - I have a partner who is using Datameer REST APIs to download the data from a workbook to integrate it with a third party product. However, the other product does not have the ability to parse ...
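If the downstream product cannot handle the default double-quote character, one workaround on the consumer side is to parse the download with an explicit quote character. A small Python sketch using the standard `csv` module (the sample data and the single-quote choice are illustrative, not the partner's actual settings):

```python
import csv
import io

# Sample export where fields are quoted with single quotes instead of
# the default double quote; the delimiter (a comma) appears inside a
# quoted field and must not split it.
data = "id,name\n1,'Smith, John'\n"

reader = csv.reader(io.StringIO(data), quotechar="'")
rows = list(reader)
print(rows)
# [['id', 'name'], ['1', 'Smith, John']]
```

`csv.writer` accepts the same `quotechar` argument, so the data could also be re-written with whatever quote character the third-party product expects.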
-
Nikhil Kumar created a post,
Cannot parse a date in another time zone
Hi team - I have a partner in Egypt who is doing a File Upload of Credit Card transactions from our Fraud Analytics App on our App Market. When I do the same File Upload, I get no errors or records...
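Time-zone parse failures like this often come down to an offset embedded in the timestamp that the parse pattern does not account for. A hedged Python sketch showing one way to parse an offset-bearing timestamp and normalize it to UTC (the sample value is hypothetical, not taken from the partner's file; Egypt is UTC+02:00):

```python
from datetime import datetime, timezone

# A timestamp carrying an explicit UTC offset, as a file uploaded from
# Egypt might contain.
raw = "2017-01-13 10:30:00+02:00"

# %z consumes the "+02:00" offset, producing a timezone-aware datetime.
dt = datetime.strptime(raw, "%Y-%m-%d %H:%M:%S%z")
print(dt.astimezone(timezone.utc).isoformat())
# 2017-01-13T08:30:00+00:00
```

If the same pattern is applied without `%z` to a value that includes an offset, parsing fails with no records, which matches the symptom described above.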
-
Nikhil Kumar commented,
That worked! Thanks so much, Konsta!
-
Nikhil Kumar commented,
The setting tez.task.resource.memory.mb=1536 did not resolve the problem by the way.
-
Nikhil Kumar commented,
Spark Client, sorry. Not Spark cluster.
-
Nikhil Kumar commented,
Thanks Kosta - Since this is 6.1.2, it is using Spark cluster mode. So I think there is a Spark-specific setting I need?