LLMs to aid in software development for NextStep/OpenStep

NeXT Computer, Inc. -> Porting New Software

Title: LLMs to aid in software development for NextStep/OpenStep
Post by: pTeK on October 31, 2023, 06:32:59 PM
Please post your LLMs here so that other users can play around with them.


Saw this one mentioned on hacker news.
Title: Re: LLMs to aid in software development for NextStep/OpenStep
Post by: pTeK on December 16, 2024, 01:22:49 PM
Perplexity.ai

Just saw this one mentioned here: https://www.theregister.com/2024/12/16/opinion_column_perplexity_vs_google/

Asked it:
How to code an IDE driver in Objective-C for NeXTSTEP 3.3
I do like the references to the NeXTSTEP Developer documentation listed on www.nextop.de.
Title: Re: LLMs to aid in software development for NextStep/OpenStep
Post by: Nitro on December 16, 2024, 02:38:11 PM
GPT-4
https://openai.com/chatgpt/

GPT-4 has helped me to write or update NeXT related programs. Works well if you are very specific with your requests. It can also lead you down a rabbit hole if you're not careful.

Llama 3
https://www.llama.com/
https://www.techradar.com/computing/artificial-intelligence/best-large-language-models-llms-for-coding

Unfortunately the local machine that Llama is loaded on doesn't have enough VRAM to motor through the code.
Title: Re: LLMs to aid in software development for NextStep/OpenStep
Post by: itomato on December 17, 2024, 06:17:40 AM
Oh, cool. Here's my "Openstep Developer" GPT - https://chatgpt.com/g/g-pY1ouRInc-openstep-developer

It has knowledge files for DriverKit development and low-level service and platform details for extending Openstep - part of my "Reopenstep" project.

I also have one that is a little less forward-looking.
Title: Re: LLMs to aid in software development for NextStep/OpenStep
Post by: Nitro on December 17, 2024, 10:28:57 AM
Quote from: itomato on December 17, 2024, 06:17:40 AMOh, cool. Here's my "Openstep Developer" GPT - https://chatgpt.com/g/g-pY1ouRInc-openstep-developer

It has knowledge files for DriverKit development and low-level service and platform details for extending Openstep - part of my "Reopenstep" project.

I also have one that is a little less forward-looking.

Bonus points for the reference to slipstreaming Y2K patches.  :)
Title: Re: LLMs to aid in software development for NextStep/OpenStep
Post by: pTeK on December 18, 2024, 01:50:00 AM
Quote from: itomato on December 17, 2024, 06:17:40 AMOh, cool. Here's my "Openstep Developer" GPT - https://chatgpt.com/g/g-pY1ouRInc-openstep-developer

It has knowledge files for DriverKit development and low-level service and platform details for extending Openstep - part of my "Reopenstep" project.

I take it this is pay-to-win?! I'm only using the free version of GPT so I can't access your trained one :/
Title: Re: LLMs to aid in software development for NextStep/OpenStep
Post by: itomato on December 18, 2024, 05:47:55 AM
Quote from: pTeK on December 18, 2024, 01:50:00 AMI'm only using the free version of GPT so I can't access your trained one :/

It wasn't shared apparently - you should be able to use it if you're logged in.
Title: Re: LLMs to aid in software development for NextStep/OpenStep
Post by: pTeK on March 03, 2025, 03:57:38 AM
Any updates on this? I was watching a Dave's Garage video on YouTube in which he was adding more data to an LLM. Just wondering if anyone has had experience after feeding it the source code from Darwin 0.3, the developer kit, and other NeXTSTEP source code.

My hardware is very limited (cheap 2019 laptop, 4 GB RAM soldered to the motherboard...)
Title: Re: LLMs to aid in software development for NextStep/OpenStep
Post by: user217 on March 04, 2025, 04:51:37 PM
In one of his more recent videos Dave Plummer showed off an RTX 6000 Ada, a $10k GPU used to create model finetunes. The budget options he shows are much weaker models, like the (somewhat deceptively named) 7B Qwen distill of DeepSeek. I don't think anything he shows for local model hosting is directly applicable to the average hobbyist. A consumer GPU can't really provide enough context for RAG (retrieval-augmented generation) unless you're querying a single legal document; basically you're looking at about 16384 tokens (a token is roughly three-quarters of a word) of memory, which just isn't enough for API documentation, much less a kernel codebase.

Probably the best tool available for this kind of problem is something like https://notebooklm.google.com/ which provides a workspace for topical/expert knowledge work. (Although most people just use it for the "generate podcast" meme button.) It is also, conveniently, still free. I have no idea what kind of context size it supports or what to expect from its vectorization abilities, but it's probably a better place to start than anything mentioned here so far.
Title: Re: LLMs to aid in software development for NextStep/OpenStep
Post by: itomato on April 04, 2025, 10:25:00 AM
Quote from: pTeK on March 03, 2025, 03:57:38 AMJust wondering if any one has had experience after feeding it the source code from Darwin 0.3 and the developer kit and other NeXTSTEP source code.

Yes, I did this with my Developer GPT above, and also to explore a rabbit hole related to the origins of MANPAGER.

https://mastodon.sdf.org/@awb/114078123638942223

https://chatgpt.com/share/67eff94e-6648-8007-841e-4c25ad44521b
Title: Re: LLMs to aid in software development for NextStep/OpenStep
Post by: rjnf on June 27, 2025, 04:58:20 PM
This was given by the paid version of Super Grok.
But I see Phind is better.

#import <driverkit/IODevice.h>
#import <driverkit/IODisk.h>
#import <driverkit/i386/ioPorts.h> // inb()/outb()/inw()/outw() port I/O

// Placeholder request type -- the real DriverKit disk API passes
// requests through IODisk's reading/writing methods instead.
typedef struct {
    BOOL     read;
    unsigned sector;
    void    *buffer;
} IORequest;

@interface IDEDriver : IODisk
{
    unsigned short basePort; // primary channel, 0x1F0
    unsigned short irq;      // IRQ 14
}

- (BOOL)probe:(IODeviceDescription *)desc;
- (IOReturn)startIO:(IORequest *)request;
- (void)interruptOccurred:(int)anIrq;
- (IOReturn)readSector:(unsigned)sector buffer:(void *)buffer;
- (IOReturn)writeSector:(unsigned)sector buffer:(void *)buffer;

@end

@implementation IDEDriver

// NOTE: in real DriverKit, probe: is a class method (+probe:) that
// allocates the instance; kept as an instance method to match the sketch.
- (BOOL)probe:(IODeviceDescription *)desc
{
    unsigned char status;

    basePort = 0x1F0;
    irq = 14;
    outb(basePort + 6, 0xA0);   // select master drive first
    outb(basePort + 7, 0xEC);   // IDENTIFY DEVICE
    status = inb(basePort + 7); // read the status register once
    if (status == 0 || status == 0xFF) {
        IOLog("No IDE at 0x%x\n", basePort);
        return NO;
    }
    [self registerInterrupt:irq]; // exact interrupt hookup varies by DriverKit class
    IOLog("IDE at 0x%x, IRQ %d\n", basePort, irq);
    return YES;
}

- (IOReturn)startIO:(IORequest *)request
{
    if (request->read)
        return [self readSector:request->sector buffer:request->buffer];
    return [self writeSector:request->sector buffer:request->buffer];
}

- (void)interruptOccurred:(int)anIrq
{
    if (inb(basePort + 7) & 0x08) // DRQ set: data ready
        IOLog("Data ready interrupt\n");
}

- (IOReturn)readSector:(unsigned)sector buffer:(void *)buffer
{
    unsigned short *p = (unsigned short *)buffer;
    unsigned char status;
    int i;

    outb(basePort + 6, 0xE0 | ((sector >> 24) & 0x0F)); // LBA mode, master, LBA 24-27
    outb(basePort + 2, 1);                     // one sector
    outb(basePort + 3, sector & 0xFF);         // LBA 0-7
    outb(basePort + 4, (sector >> 8) & 0xFF);  // LBA 8-15
    outb(basePort + 5, (sector >> 16) & 0xFF); // LBA 16-23
    outb(basePort + 7, 0x20);                  // READ SECTORS
    do { status = inb(basePort + 7); } while (status & 0x80); // wait for BSY clear
    if (!(status & 0x08))
        return IO_R_IO;                        // DRQ never came up
    for (i = 0; i < 256; i++)                  // 256 words = 512 bytes
        p[i] = inw(basePort);
    return IO_R_SUCCESS;
}

- (IOReturn)writeSector:(unsigned)sector buffer:(void *)buffer
{
    unsigned short *p = (unsigned short *)buffer;
    unsigned char status;
    int i;

    outb(basePort + 6, 0xE0 | ((sector >> 24) & 0x0F)); // LBA mode, master
    outb(basePort + 2, 1);
    outb(basePort + 3, sector & 0xFF);
    outb(basePort + 4, (sector >> 8) & 0xFF);
    outb(basePort + 5, (sector >> 16) & 0xFF);
    outb(basePort + 7, 0x30);                  // WRITE SECTORS
    do { status = inb(basePort + 7); } while (status & 0x80);
    if (!(status & 0x08))
        return IO_R_IO;
    for (i = 0; i < 256; i++)
        outw(basePort, p[i]);
    return IO_R_SUCCESS;
}

@end
Title: Re: LLMs to aid in software development for NextStep/OpenStep
Post by: Nitro on August 10, 2025, 05:28:57 PM
How I Code with AI on a budget/free

https://wuu73.org/blog/aiguide1.html
Title: Re: LLMs to aid in software development for NextStep/OpenStep
Post by: pTeK on August 11, 2025, 10:53:03 PM
Anyone been using these lately for NS stuff?
