Do Your Own Damn Research
Me:
Him: :downs:
So, my phone rings. I pick it up. "<company>, this is Wolfy," I say.
:downs: "Yes, hello, Wolfy, I have a question about databases that I could ask you."
"Sure, what's up?"
:downs: "I am trying to write out <table> to a CSV file, and it won't let me write out files larger than 32 kilobytes."
Now, <table> can be in excess of 3 billion rows. If that doesn't illustrate my point: we have 8 programs that each process 1/8 of this table's child table, and a full run can take up to a week and a half of constant processing. It's. Big.
However, it's not my place to question why they want to do this; I'm here to answer this gentleman's question. Now, I call malarkey on his claim that it caps his output files at 32 kilobytes, but...again, whatever. I can't prove it's not a problem, and this guy won't hear otherwise.
So, I suggested that he process <table> in batches of 250,000 rows and write out a file for each batch. At the end of the program, he merges all of those files together and presents the user with the one merged file. I recommended he use cat and a bit of piping for the merge (roughly the approach sketched after this story).
:downs: "Okay, okay. You will e-mail me when you have found how to do this?"
"Uh...uh...well, I really don't have the time to research that for you."
:downs: "Okay, okay. I will then, uh, research this 'cat' in UNIX."
...the Hell? How do you get any kind of worthwhile degree in SE or CS without any UNIX or Linux experience? And expecting me to do his research for him? Really?
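
For anyone curious, here's a minimal sketch of the batch-and-merge idea in Python. Everything specific in it is made up for illustration: sqlite3 stands in for whatever database he was actually hitting, and big_table is a placeholder name. The shape is the point: fetch rows in batches of 250,000, write each batch to its own chunk file, then concatenate the chunks into one CSV, which is the same job cat does from the shell.

import csv
import shutil
import sqlite3  # stand-in only; his actual database is unknown

BATCH_SIZE = 250_000  # the batch size I suggested on the call

def export_in_batches(conn, query, out_path="merged.csv"):
    """Dump a query to CSV in chunks, then concatenate the chunks."""
    cur = conn.cursor()
    cur.execute(query)
    chunk_paths = []
    # fetchmany() returns an empty list when the cursor is exhausted,
    # so iter() with an empty-list sentinel stops the loop cleanly.
    for chunk_no, rows in enumerate(iter(lambda: cur.fetchmany(BATCH_SIZE), [])):
        chunk_path = f"chunk_{chunk_no:05d}.csv"
        with open(chunk_path, "w", newline="") as f:
            csv.writer(f).writerows(rows)
        chunk_paths.append(chunk_path)
    # The merge step: same result as `cat chunk_*.csv > merged.csv` in the shell.
    with open(out_path, "wb") as merged:
        for path in chunk_paths:
            with open(path, "rb") as chunk:
                shutil.copyfileobj(chunk, merged)
    return out_path

# Hypothetical usage; the database file and table name are placeholders.
# conn = sqlite3.connect("warehouse.db")
# export_in_batches(conn, "SELECT * FROM big_table")
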