
Unclear Running Instruction - Was not able to use. #251

Open

Description

@nk3015

None of the models seem to work.
They report either that the model is outdated (in the case of babbage and davinci) or that the quota is exceeded (in the case of 3.5 and 4o).
What do I do?

Activity

tibo-openai (Collaborator) commented on Apr 17, 2025
Have you tried running with o4-mini? What error are you seeing as a result?

nk3015 (Author) commented on Apr 18, 2025

> Have you tried running with o4-mini? What error are you seeing as a result?

Hi, that is the model it was set to by default when I first ran codex.

On screen it shows:

system
    Warning: model "o4-mini" is not in the list of available models returned by OpenAI.

The output I get from any response is as follows:

file:///C:/Users/User/AppData/Roaming/npm/node_modules/@openai/codex/dist/cli.js:445
    [minified cli.js source context omitted]

e [Error]: Your organization must be verified to use the model `o4-mini-2025-04-16`. Please go to: https://platform.openai.com/settings/organization/general and click on Verify Organization.
    at e.a [as iterator] (file:///C:/Users/User/AppData/Roaming/npm/node_modules/@openai/codex/dist/cli.js:445:1514)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async R0.run (file:///C:/Users/User/AppData/Roaming/npm/node_modules/@openai/codex/dist/cli.js:462:2639) {
  status: undefined,
  headers: undefined,
  request_id: undefined,
  error: {
    type: 'invalid_request_error',
    code: 'model_not_found',
    message: 'Your organization must be verified to use the model `o4-mini-2025-04-16`. Please go to: https://platform.openai.com/settings/organization/general and click on Verify Organization.',
    param: null
  },
  code: 'model_not_found',
  param: null,
  type: 'invalid_request_error'
}

Node.js v23.5.0
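
A quick way to see what the warning above ("o4-mini" is not in the list of available models returned by OpenAI) actually means is to list the models the key can reach, outside of Codex entirely. This is only a sketch, assuming Node 18+, the official openai npm package installed locally, and OPENAI_API_KEY exported in the environment (the same variable the Codex CLI reads):

import OpenAI from "openai";

// Picks up OPENAI_API_KEY from the environment.
const client = new OpenAI();

async function main(): Promise<void> {
  const ids: string[] = [];
  // models.list() is paginated; for-await walks every page.
  for await (const model of client.models.list()) {
    ids.push(model.id);
  }
  console.log(ids.sort().join("\n"));
  // If no "o4-mini*" id appears, access is restricted at the account level
  // (the organization-verification step the error message links to), so it
  // is not something the Codex CLI itself can fix.
  console.log("o4-mini available:", ids.some((id) => id.startsWith("o4-mini")));
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});

If o4-mini is missing while other models are listed, the organization-verification link in the error is the relevant step; regenerating the API key alone would not be expected to change which models the key can reach.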

I hope this can be made accessible.
I've tried regenerating the API key, but that didn't work either.

And when I try another model, like gpt-4o-mini, it hits me with 'insufficient_quota'.

Hence, I have not been able to use Codex and have not seen it work outside the YouTube demo.
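
The 'insufficient_quota' error can be reproduced the same way. Below is a minimal sketch that sends one tiny request with the same key (the model and prompt are only examples); if it fails with insufficient_quota outside Codex too, the account has no usable credit and the CLI will fail identically no matter which model is selected:

import OpenAI from "openai";

const client = new OpenAI();

async function main(): Promise<void> {
  try {
    // One cheap request is enough to reproduce (or rule out) the quota error.
    const completion = await client.chat.completions.create({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: "ping" }],
      max_tokens: 5,
    });
    console.log("ok:", completion.choices[0]?.message.content);
  } catch (err) {
    if (err instanceof OpenAI.APIError && err.code === "insufficient_quota") {
      // Billing/credit problem on the account, not a bug in the Codex CLI.
      console.error("Account quota exhausted:", err.message);
    } else {
      throw err;
    }
  }
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});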


Metadata

Assignees: no one assigned
Labels: documentation (Improvements or additions to documentation)
Type: no type
Projects: no projects
Milestone: no milestone
Relationships: none yet
Development: no branches or pull requests
Participants: @tibo-openai, @nk3015
