Cincai Patron wrote:
    void tokenizer(const char line[], const vector<string>& field)
    {
        for (..) {
            char *c = new char[size];
            strncpy(c, ...);
            field.push_back(c);
            delete [] c;
        }
        field.push_back(...);  // last statement
    }
How did you get this to compile at all? The compiler should have barfed at the first field.push_back(), because field is a const reference. I suspect the problem originates well away from where it's appearing; segfaults tend to do that. The nearest suspect I see is strncpy. I think the sequence field.push_back(c); delete [] c; is OK, because c should be converted implicitly, as in field.push_back(std::string(c)), and the std::string copies the characters before they're deleted.

If you're trying to read a line of data and separate it into tokens, there are some useful facilities in <iostream> and the standard library. If I recall correctly, something like the following will work:

    std::copy( std::istream_iterator<std::string>( std::cin ),
               std::istream_iterator<std::string>(),
               std::back_inserter( field ) );

--
JDL
Non enim propter gloriam, diuicias aut honores pugnamus set propter libertatem solummodo quam nemo bonus nisi simul cum vita amittit.