AI Fabric Pipeline Failed

I am not able to train my file in AI Fabric Text Classification. The training pipeline keeps failing.
When I use the (Movie Review) files shared in the UiPath Academy assignment it works, but when I use a new CSV file with my own input, it does not.

Error log:
er training-cf2256ba-b508-4614-849d-832baf90195c/e54a0065-2f74-47e7-b7d7-a58a8045c01c/450e108b-6e6a-4256-9497-1ee022ccdc45 with size 1 downloaded successfully
2020-07-31 11:11:21,245 - wrapper.utils:_retries:20 - INFO: Total time taken to execute func: download : 0.22377943992614746
2020-07-31 11:11:21,246 - wrapper.training_wrapper:train_model:91 - INFO: Start model training…
2020-07-31 11:11:21,246 - wrapper.training_wrapper:initialize_model:85 - INFO: Start model initialization…
2020-07-31 11:11:21,247 - wrapper.training_wrapper:initialize_model:88 - INFO: Model initialized successfully
2020-07-31 11:11:21,247 - aiflib.data_manager:info:15 - INFO: Loading data from /data/dataset…
2020-07-31 11:11:21,253 - aiflib.data_manager:info:15 - INFO: Unable to read any valid data from *.json files in [/data/dataset]
2020-07-31 11:11:21,254 - aiflib.data_manager:info:15 - INFO: Done read [13] points with [1] classes:
2020-07-31 11:11:21,254 - aiflib.data_manager:info:15 - INFO: Data must have at least 2 classes.
2020-07-31 11:11:21,254 - wrapper.training_wrapper:run:140 - ERROR: Training failed for pipeline type: TRAIN_ONLY, error: No valid data to run this pipeline.
2020-07-31 11:11:21,299 - main:main:78 - ERROR: Training Job failed, error: No valid data to run this pipeline.
Traceback (most recent call last):
File "", line 73, in main

As the log says, the data must have at least 2 classes (positive and negative), but you have not provided any data for the negative class. For better training I would also recommend adding more data points.
So the solution is to add at least 10 data points for each class, negative and positive. Please mark this as the solution if it works for you.
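Before uploading, you can sanity-check the CSV against these two requirements (at least 2 classes, at least 10 rows per class). This is just a minimal sketch with Python's standard library; the column name `label` is an assumption here, so substitute whatever target column your AI Fabric dataset actually uses:

```python
import csv
import io
from collections import Counter

def validate_dataset(csv_text, label_column="label", min_per_class=10):
    """Check a text-classification CSV for the constraints seen in the log:
    at least 2 distinct classes, each with a minimum number of rows.
    `label_column` is a hypothetical column name - adjust to your schema."""
    reader = csv.DictReader(io.StringIO(csv_text))
    counts = Counter(row[label_column] for row in reader)
    problems = []
    if len(counts) < 2:
        problems.append(f"need at least 2 classes, found {len(counts)}")
    for label, n in counts.items():
        if n < min_per_class:
            problems.append(f"class '{label}' has only {n} rows, need {min_per_class}")
    return counts, problems

# Build a small balanced sample: 10 positive and 10 negative rows.
sample = "text,label\n" + "\n".join(
    [f"good movie {i},positive" for i in range(10)]
    + [f"bad movie {i},negative" for i in range(10)]
)
counts, problems = validate_dataset(sample)
print(counts)    # both classes counted
print(problems)  # empty list: dataset passes both checks
```

If `problems` is non-empty, fix the CSV locally before re-running the training pipeline; that avoids waiting for the job to fail with "Data must have at least 2 classes."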


I'm facing the same issue. Did you manage to solve your problem?