People should use AI less, and learn more about how to program
Yes. Once you know how to program, you can see the pitfalls of AI.
Interesting, but I’ve never needed AI for coding. Well, twice, and I had to make changes, but I wouldn’t use AI to generate code.
My small Python scripts (~100 lines of code) aren’t maintainable, but I’m happy with them. I don’t ever plan to work on serious projects with Python, so I can’t say much about its maintainability. But, from limited experience, I’d rather use C++, C#, or in my special case, G’MIC if maintainability matters to me.
In my experience, it is.
I once converted a Python script into G’MIC, and then someone else did a Python version of my own code. G’MIC is interpreted, with a JIT math parser. The results:
Reversing digits in a 1024x1024 RGB image:
Python: Without lookup table and NumPy - 3+ minutes
Python: With lookup table and NumPy - 6.5 s (someone else’s machine, but it shouldn’t take that long)
G’MIC: Without lookup table - 0.3 s
G’MIC: With lookup table - 0.005 s
And I did Lavander Binary Map on my machine; you can find the code for the Python version in github/gmic-community/include/reptorian.gmic:
Python: 3 s (Without lookup table)
G’MIC: 0.15 s (Without lookup table)
G’MIC: 0.05 s (With lookup table)
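For context, the lookup-table trick on the Python side is roughly this; just my own NumPy sketch, not the code that was actually benchmarked:

```python
import numpy as np

# Sketch of the lookup-table approach: precompute the reversed decimal
# digits of every possible 8-bit value once, then remap the whole image
# with a single fancy-indexing pass instead of a per-pixel Python loop.
lut = np.array([int(str(v)[::-1]) for v in range(256)], dtype=np.uint16)

def reverse_digits(img: np.ndarray) -> np.ndarray:
    """img: uint8 image (e.g. 1024x1024x3); returns the digit-reversed values."""
    return lut[img]
```

Applied to a 1024x1024x3 uint8 array, lut[img] is a single vectorized gather, which is where most of the speedup over looping in Python comes from.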
Honestly, I find Python pretty bad for image processing in general.
I use the Levels filter tool for that in Krita. It’s already non-destructive.
Scala does look nice. Just a quick look at the syntax makes me want to give it a whirl when I want an alternative to Python. I used to code in C++ and C#, and I use G’MIC (a DSL) as my main. Scala seems right up my alley.
Yes. <center></center> isn’t part of HTML5. It is part of HTML4, though.
Yes, something like that. I provided a spoiler example recently. And I would definitely like to be able to adjust what’s going to be rendered by editing directly in the rendered viewport.
When I commit, I write a title for what I did and describe it, and then use periods for related commits. Just easier.
This existed? Meh.
I’m just glad I have options other than Python. I’m not afraid of writing my own solutions either. I rarely use Python these days.
For small projects, rewriting is often superb. It allows us to reorganize a mess, apply new knowledge, add neat features and doodads, etc.
This. I’m coding to contribute to an open-source project with a very small number of coders, and in a non-mainstream domain-specific language. A lot of the code I wrote before has proven to work from time to time, but it could all benefit from better outputs and a better GUI. So I end up re-engineering the entire thing, and that will take a really long time; however, I do a lot of tests to ensure it works.
I have to say, I really like the concept behind this. It may be another tool for parsing strings I have besides Python.
I don’t understand your problem well enough to know if you can (or want to) use this here, but you might be able to tap into that C performance with printf’s radix-conversion formatting.
The problem is printing a big binary number in decimal. That’s not an easy problem, because 10 is not a power of 2. If we lived in a base-hex world, this would be very easy to solve in O(n).
Also, I can’t access that, as G’MIC is a language that can’t really communicate with other languages; it’s not meant to share memory.
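To illustrate why the base matters (my own Python sketch, nothing to do with G’MIC or printf):

```python
# Binary -> hex is a purely local remapping: every 4 bits become one hex
# digit, so the conversion is a single O(n) pass over the bit string.
bits = "101111101010110111101111"           # example, length divisible by 4
hex_digits = "".join(
    "0123456789abcdef"[int(bits[i:i + 4], 2)]
    for i in range(0, len(bits), 4)
)

# Binary -> decimal has no such local mapping: the naive approach divides
# the whole number by 10 for every output digit, which is O(n^2) overall.
```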
This could be an XY problem, that is, you’re trying to solve problem X, rather than the underlying problem Y. Y here being: Why do you need things to be in decimal in the first place?
I wouldn’t say it’s needed, but this is more of a fun thing for me. The only thing I’m using this for is Tupper’s self-referential formula, and my current approach of converting from base 1<<24 to base 1e7 works instantly for 106x17 binary digits. When I load an image into that filter that’s bigger than somewhere over 256x256, delays are noticeable because the underlying algorithm isn’t that great, but it could also have to do with the fact that G’MIC is interpreted, and despite its JIT support, this is not the kind of problem it’s meant to solve (it’s domain-specific). On the bright side, this algorithm will work with any data type as long as one data type is one level higher than the other; in this case I’m using the lowest pair (single and double), and the bigger the data types, the faster it can be.
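For anyone curious, here’s a minimal sketch of that kind of limb-based base conversion. Python for illustration only; the names and structure are mine, not the actual filter’s:

```python
SRC_BASE = 1 << 24   # source limbs (each fits exactly in a single-precision float)
DST_BASE = 10 ** 7   # destination limbs

def convert_limbs(src):
    """src: list of base-2**24 digits, most significant first.
    Returns the same number as base-10**7 digits, most significant first."""
    out = []
    while any(src):
        # One pass of schoolbook long division of src by DST_BASE.
        rem, quotient = 0, []
        for limb in src:
            cur = rem * SRC_BASE + limb      # always < 10**7 * 2**24, fits in a double
            quotient.append(cur // DST_BASE)
            rem = cur % DST_BASE
        out.append(rem)                      # next base-10**7 digit, least significant first
        # Drop leading zero limbs before the next pass.
        i = 0
        while i < len(quotient) - 1 and quotient[i] == 0:
            i += 1
        src = quotient[i:]
    return out[::-1] or [0]

# Example: [1, 5] in base 2**24 is 16777221, i.e. [1, 6777221] in base 10**7.
print(convert_limbs([1, 5]))   # -> [1, 6777221]
```

Each output digit costs a pass over all remaining input limbs, so the naive version is roughly quadratic, which is why it slows down on larger inputs.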
Even simpler is repeat 10 { }
} just stands for done.
I don’t think we have a difference of opinion. What I’m saying is that some apps are the result of many years of development, and in those cases C++ will likely be the only realistic option because it is way more time-consuming to switch. For example, Krita. I do agree that when there’s a choice, C++ is less relevant these days.
G’MIC solution
spoiler
it day2 crop. 0,0,0,{h#-1-2} split. -,{_'\n'}
foreach { replace_str. " ",";" ({t}) rm.. }
safe_0,safe_1=0
foreach {
  ({h}) a[-2,-1] y
  num_of_attempts:=da_size(#-1)+1
  store temp
  repeat $num_of_attempts {
    $temp
    if $> eval da_remove(#-1,$>-1) fi
    eval "
      safe=1;
      i[#-1,1]>i[#-1,0]?(
        for(p=1,p<da_size(#-1),++p,
          if(!inrange(i[#-1,p]-i[#-1,p-1],1,3,1,1),safe=0;break(););
        );
      ):(
        for(p=1,p<da_size(#-1),++p,
          if(!inrange(i[#-1,p-1]-i[#-1,p],1,3,1,1),safe=0;break(););
        );
      );
      safe;"
    rm
    if $>
      if ${} safe_1+=1 break fi
    else
      if ${} safe_0,safe_1+=1 break fi
    fi
  }
}
echo Day" "2:" "${safe_0}" :: "${safe_1}