Hi everyone,
Recently I was working on a project on question generation with GPT-2 (I used the GPT-2 small model). To generate questions I was calling the model's generate function:
decoded_sequences = model.generate(
    input_ids=input_ids,
    attention_mask=attention_mask,
    max_length=origin_seq_len + MAX_QUESTION_SPACE,
    min_length=origin_seq_len + MIN_QUESTION_SPACE,
    pad_token_id=0,
    bos_token_id=1,
    eos_token_id=2,
    do_sample=True,
    num_beams=5,
    repetition_penalty=1.3,
    no_repeat_ngram_size=3,
    num_return_sequences=3,
)
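For context, input_ids and attention_mask come from the Hugging Face GPT-2 tokenizer, roughly as sketched below. The checkpoint path, prompt text, and the MAX_QUESTION_SPACE / MIN_QUESTION_SPACE values are placeholders for what I actually use in my script:

from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Placeholder bounds on how many tokens the generated question may add.
MAX_QUESTION_SPACE = 32
MIN_QUESTION_SPACE = 5

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("path/to/my-finetuned-gpt2")  # placeholder path

prompt = "my context passage followed by the answer prompt"  # placeholder text
encoding = tokenizer(prompt, return_tensors="pt")
input_ids = encoding["input_ids"]
attention_mask = encoding["attention_mask"]
origin_seq_len = input_ids.shape[1]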
Using ctranslate2 I was able to convert the model, but when I try to generate a question I get this error:
TypeError: generate_batch(): incompatible function arguments. The following argument types are supported:
    1. (self: ctranslate2._ext.Generator, start_tokens: List[List[str]], *, max_batch_size: int = 0, batch_type: str = 'examples', asynchronous: bool = False, beam_size: int = 1, patience: float = 1, num_hypotheses: int = 1, length_penalty: float = 1, repetition_penalty: float = 1, no_repeat_ngram_size: int = 0, disable_unk: bool = False, suppress_sequences: Optional[List[List[str]]] = None, end_token: Optional[str] = None, max_length: int = 512, min_length: int = 0, return_scores: bool = False, return_alternatives: bool = False, min_alternative_expansion_prob: float = 0, sampling_topk: int = 1, sampling_temperature: float = 1) -> Union[List[ctranslate2._ext.GenerationResult], List[ctranslate2._ext.AsyncGenerationResult]]

Invoked with: <ctranslate2._ext.Generator object at 0x7f5138bf8d30>, [88, 26560, 4215, 13670, 3809, 10214, 460, 3031, 351, 262, 749, 5035, 290, 41484, 25, 25652, 25]; kwargs: max_length=50, batch_type='tokens
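For reference, here is roughly what I am doing on the CTranslate2 side. The paths and variable names are placeholders for my actual setup, and the generate_batch call at the end is the one that raises the TypeError above:

import ctranslate2

# Conversion step (this completed without errors); the paths are placeholders
# for my fine-tuned checkpoint and the output directory I chose:
#   ct2-transformers-converter --model path/to/my-finetuned-gpt2 --output_dir gpt2_qg_ct2

generator = ctranslate2.Generator("gpt2_qg_ct2")

# This call fails: I am passing a flat list of token IDs, while the signature
# in the error says start_tokens should be a List[List[str]] of token strings.
results = generator.generate_batch(
    input_ids[0].tolist(),
    max_length=50,
    batch_type="tokens",
)

From the signature in the error it looks like generate_batch expects a batch of string tokens rather than integer IDs, so should I be passing something like [tokenizer.convert_ids_to_tokens(input_ids[0].tolist())] instead? Any pointers on the right way to run a converted GPT-2 generator would be appreciated.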
Thanks!