I would say Redox is miles ahead of Minix, am I wrong?
Don’t write “if” in your tests! It makes very, very little sense: how is it that you test your application and are unsure what the resulting outcome of a call will be? Does it depend on the arguments? Then fix the arguments and expect one specific result. Does it depend on the environment? Fix/mock the environment.
No “ifs” in the tests!
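Something like this minimal sketch is what I mean (pytest-style; the `convert` function and `FixedRateSource` are made up for illustration): pin the argument and the environment, and there is exactly one expected result, so there is nothing to branch on.

```python
# Hypothetical function under test: converts an amount using a rate source.
def convert(amount, rate_source):
    return round(amount * rate_source.get_rate(), 2)


class FixedRateSource:
    """Test double: the 'environment' is pinned instead of queried live."""
    def get_rate(self):
        return 1.25


def test_convert_with_fixed_rate():
    # Argument and environment are both fixed, so there is exactly
    # one expected outcome -- no "if" needed to pick the assertion.
    assert convert(10.0, FixedRateSource()) == 12.50
```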
Okay, that definitely explains it
Banned for being linked to Russian state, or for being Russian? Lol those are very very different
kids with large bodies do less good in large buildings with teachers, end up earning less valuable currency later in smaller cubicles without teachers
More valuable information soon, stay tuned
I am mostly complaining about his writing style. Obviously the subject itself is interesting (to some people)
I wouldn’t bet my eye on it, but who knows!.. Maybe he was a better teacher before!
Again, Knuth himself said in a preface that Volumes 2 through 5 are independent.
That sounds interesting, will take a look. I am not against theoretical computer science, I just think Knuth doesn’t read like a good teacher…
Because volume 1 is not available in the library
Edit: but also the volumes aren’t dependent on each other. They treat very different topics; I doubt reading Volume 1 will help with Volume 4.
I don’t understand why this is called a “subset” when it clearly contains new syntax
A subset would be understood by older compilers, this is a superset
Absolutely not a replacement for VBA. Not even close. As usual, Microsoft hypes something everyone wants, and then implements something nobody asked for
I feel offended by you somehow equating perl and lisp
This is a much better done meme
The other one before makes zero sense
2100 parameters is a documented ODBC limitation (and it applies to all statements in a batch).
This means that an “insert into (c1, c2) values (?,?), (?,?)…” can only have 2100 bound parameters in total. It has nothing to do with the code, and even less with the surrounding code being “spaghetti”
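To illustrate, here is roughly how I chunk a multi-row insert to stay under that cap (a sketch with pyodbc; the table and column names are placeholders, and SQL Server also caps a single VALUES list at 1000 rows, hence the extra clamp):

```python
import pyodbc

MAX_PARAMS = 2100       # documented ODBC / SQL Server limit on bound parameters per batch
MAX_VALUES_ROWS = 1000  # SQL Server limit on rows in a single VALUES list


def insert_in_chunks(conn, rows, table="my_table", columns=("c1", "c2")):
    """Insert rows in batches that stay under the bound-parameter cap."""
    n_cols = len(columns)
    rows_per_batch = min(MAX_VALUES_ROWS, MAX_PARAMS // n_cols)  # e.g. 2100 // 2 = 1050 -> clamp to 1000
    cursor = conn.cursor()
    for start in range(0, len(rows), rows_per_batch):
        chunk = rows[start:start + rows_per_batch]
        placeholders = ", ".join("(" + ", ".join("?" * n_cols) + ")" for _ in chunk)
        sql = f"INSERT INTO {table} ({', '.join(columns)}) VALUES {placeholders}"
        cursor.execute(sql, [value for row in chunk for value in row])  # flatten row tuples
    conn.commit()

# usage (connection string is a placeholder):
# conn = pyodbc.connect("DSN=my_dsn")
# insert_in_chunks(conn, [(1, "a"), (2, "b")])
```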
The tables ARE normalised; the fact that there are 50 columns is because the underlying market-data calibration function expects dozens of parameters and returns dozens of other results, such as volatility, implied durations, forward duration and more
The amount of immaturity, inexperience, and ignorance coming from 2 people here is astounding
Blocked
You should take a break from trolling
I timed the transaction and the opening of the connection; it takes maybe 100 milliseconds, which absolutely doesn’t explain the abysmal performance
A transaction is needed because 2 tables are touched; I don’t want to deal with partially inserted data
Cannot share the code, but it’s Python calling .NET through “clr”, and using SqlBulkCopy
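Roughly what that setup looks like (a simplified sketch, not the actual code; the connection string, table and column names are placeholders, and it assumes pythonnet loading the .NET Framework’s System.Data assembly):

```python
import clr
clr.AddReference("System.Data")   # assembly that contains SqlBulkCopy

from System import String, Double
from System.Data import DataTable
from System.Data.SqlClient import SqlConnection, SqlBulkCopy


def bulk_insert(conn_str, rows):
    """Bulk-load (instrument, volatility) rows via SqlBulkCopy."""
    table = DataTable()
    table.Columns.Add("instrument", String)
    table.Columns.Add("volatility", Double)
    for instrument, vol in rows:
        table.Rows.Add(instrument, vol)

    conn = SqlConnection(conn_str)
    conn.Open()
    try:
        bulk = SqlBulkCopy(conn)
        bulk.DestinationTableName = "dbo.CalibrationResults"
        bulk.WriteToServer(table)
    finally:
        conn.Close()
```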
What do you suggest, if I shouldn’t be using that? It’s either a prepared query with thousands of parameters, or a plain text string with the parameters inlined (which, admittedly, I didn’t try; it might be faster lol)
Will try bcp & report back. EDIT: I can’t install bcp because it is only distributed with SQL Server itself, and I cannot install it on my corporate laptop.
I will try bcp. Somehow, I was convinced I had to have access to the machine running the SQL Server to use it, but from the docs I see I can specify a remote host… Will report back! EDIT: I can’t install bcp because it is only distributed with SQL Server itself, and I cannot install it on my corporate laptop.
You’re not stupid, Python’s packaging & versioning is a PITA. As long as you write it for yourself, you’re good. As soon as you want to share it, you have a problem